broken


  • apple-iphone.3gs.repair.diy

    I will show you how to repair a cracked or broken iPhone 3G, iPhone 3GS, iPhone 4, or iPod Touch screen yourself.

    In brief, what you need to know

    • The 3G and 3GS screens are different! You cannot mount an iPhone 4 screen on a 3G(S).
    • There are NO 3rd-party or lower-quality screens; all the 3rd-party repair companies just LIE.
    • Different kits are available:
      • LCD only, for 12€, to repair a broken LCD screen only: the iPhone has an intact glass screen, but the image behind the glass is absent, broken or distorted.
      • Glass + digitizer, for 22€: for an iPhone 3G with a broken glass screen or a malfunctioning touch screen while the image (LCD screen) behind the glass is still intact.
      • Glass + digitizer + LCD + button + speaker, pre-mounted, for 42€: the most expensive but the easiest to replace.
    • Duration: 30, 15 or 2 minutes depending on the kit.

    I went for the most expensive kit on ebay.fr, which allows you to replace the broken screen by removing 2 screws and connecting 3 cables!

    By the way, I would NEVER send my phone to anyone; how do you think all those private pictures end up on forums? Removing the SIM card is not enough: there are tools to recover data even after files have been deleted!

    Glass + digitizer + LCD + button + speaker kit mounted with all required tools to perform the repair

    [Images: apple-iphone-screen-repair-kit, apple-iphone-screen-repair-kit2]

    Step 1

    Remove the 2 small Phillips screws located at the bottom of the iPhone

    [Image: apple-iphone-screen-repair-step1]

    Step 2

    Using the suction cups, pull up the upper part of the screen while holding the body; it will come off with no effort.

    [Images: apple-iphone-screen-repair-step2, apple-iphone-screen-repair-step3]

    Step 3

    Look at the connectors that you will have to pull off using the tool. Apple numbered the black ribbon cables 1, 2, and 3; ribbon 3 is hidden under the cable that connects to 2. Pull the cables off in this order: 2, 1, then 3.

    [Images: apple-iphone-screen-repair-step4, apple-iphone-screen-repair-step5]

    Step 4

    Connect the new screen by plugging in the cables in this order: 3, 1, then 2. Verify that everything works properly before closing the case back with the 2 Phillips screws.

    [Images: apple-iphone-screen-repair-step6, apple-iphone-screen-repair-step7]

  • CedThumbnails has been updated to version 2.5.9; it contains 4 new features and corrects some bugs for Joomla 2.5. For existing users, the update will display in the Extensions Manager under Updates. If you do not have it installed yet, you can click the link below and install it as usual via the Extensions Manager.

    NEW: The function that detects images in your articles is now smarter and supports better fallbacks. It now supports Joomla article metadata (intro image and full-article image). With the first option selected, the system will always find at least one image to render as a thumbnail. Priority runs from left to right.

    • search in intro text -> use intro image -> search in full text -> use full article image (NEW); this is the default,
    • search in intro text only,
    • search in intro text -> use intro image (NEW),
    • search in full text only,
    • search in full text -> use full article image (NEW),
    • search in intro text -> in full text,
    • search in full text -> in intro text,

    If no image is found, despite going through intro text, full text, intro image and full-article image, the system falls back to a default image that can be configured per module/plugin.
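    A minimal sketch of that fallback chain, in Python rather than the extension's actual PHP; the function and field names here are hypothetical and only illustrate the priority order:

```python
import re

# Hypothetical sketch of the fallback priority described above; the field
# names ("introtext", "intro_image", ...) are illustrative, not the
# extension's real API.
IMG_RE = re.compile(r'<img[^>]+src="([^"]+)"')

def find_thumbnail(article, default_image=None):
    """Return the first image found, scanning sources left to right."""
    candidates = [
        IMG_RE.search(article.get("introtext", "")),   # search in intro text
        article.get("intro_image"),                    # metadata: intro image
        IMG_RE.search(article.get("fulltext", "")),    # search in full text
        article.get("full_image"),                     # metadata: full article image
    ]
    for found in candidates:
        src = found.group(1) if isinstance(found, re.Match) else found
        if src:
            return src
    # nothing found anywhere: per-module/plugin default image
    return default_image
```

    With the default option, an article whose intro text has no img tag but whose metadata defines an intro image still gets a thumbnail; only when every source is empty does the configured default image apply.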

    NEW: Support for alternate image text (alt) and captions.

    NEW: Support for new resizing methods:

    • inside: the image fits the given dimensions from the inside; aspect ratio is kept.
    • outside: the image will be at least as big as X x Y; aspect ratio is kept.
    • fill: the image is stretched as necessary; aspect ratio may not be kept. It's the default resizing method.
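    The three resizing methods boil down to how the output dimensions are computed from the source size and the requested box. A hedged sketch (not the extension's actual code):

```python
def target_size(w, h, box_w, box_h, method="fill"):
    """Illustrative sketch: output dimensions for the three resizing methods."""
    if method == "fill":
        return box_w, box_h            # stretch to the box; aspect ratio may be lost
    scale_w, scale_h = box_w / w, box_h / h
    if method == "inside":
        s = min(scale_w, scale_h)      # whole image fits inside the box
    elif method == "outside":
        s = max(scale_w, scale_h)      # image covers at least box_w x box_h
    else:
        raise ValueError(f"unknown method: {method}")
    return round(w * s), round(h * s)
```

    For a 400x200 source and a 100x100 box, "inside" yields 100x50 (fits within), "outside" yields 200x100 (covers the box), and "fill" forces 100x100.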

    NEW: Support for scaling methods; this determines when to scale an image:

    • any: resize regardless of the image size. It's the default scaling method.
    • up: resize if image is smaller than the new dimensions.
    • down: resize if image is larger than the new dimensions.
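    The scaling method is then just a guard applied before resizing. Sketched under the same assumptions (the exact smaller/larger comparison the extension uses is a guess):

```python
def should_resize(w, h, box_w, box_h, scaling="any"):
    """Illustrative sketch: decide whether to resize at all."""
    if scaling == "any":
        return True                     # resize regardless of the image size
    if scaling == "up":
        return w < box_w and h < box_h  # only enlarge images smaller than the target
    if scaling == "down":
        return w > box_w or h > box_h   # only shrink images larger than the target
    raise ValueError(f"unknown scaling: {scaling}")
```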

    On a side note, the code has been improved: no more static methods and better decoupling, but that's another story that interests only developers. Anyway, adding new features will be easier!

  • Apple iPhones don't fly well, and here is a picture to prove it:

    [Image: broken.apple.iphone.3gs]

    This weekend I will publish an online “how to replace your iPhone 3GS LCD screen yourself” guide.

    • The glass can be found on eBay for 22 euros.
    • The whole assembled kit (glass + LCD + frame + home button + speakers + a kit of 7 tools) for less than 42 euros! This is the kit I ordered online yesterday.

    [Image: iphone.3gs.screen.repair.kit]

    By the way, Apple asks a little over 200 CHF to replace the glass.

  • There are still plenty of reasons to jailbreak that aren't directly related to iPhone-specific hardware.

    • Cydia, an APT front end for your iPhone: get all the applications, most of them open source or refused by Apple!
      By the way, Cydia offered cut & paste, video editing, backgrounding, MMS and more, years before Apple gave us OS 3.0. What's coming in OS 4.0? Skins and themes :-)
    • Winterboard: apply themes and skins to get a more fun springboard, for example with the Little Big Planet skins; there are literally hundreds of them!
      [Images: little.big.planet.cydia, cydia.theme]
    • Live Clock and Weather icons,
    • SBSettings, a customizable HUD with handy settings switches accessed by swiping across the status bar in any app or on the home screen, plus numerical status displays,
    • Customizable system fonts,
    • Multitasking support, dead simple: press Home for 2 seconds to put an app in the background, and 2 seconds more to disable backgrounding. I did not see any huge power-drain difference running 2 instant-messaging apps (Skype and Nimbuzz) for 2 hours without interruption. The only real issue, still to be confirmed, is that the iPhone gets really hot on the back. I suspect Apple is aware of this problem and did not ship application backgrounding for this reason.
    • Full access to the file system, normal users don’t need that, but developers find it fun!
    • Terminal, SSH (OpenSSH, yes), VNC and Vim: I can now remotely connect to my host with my RSA keys, but more on that later…

    How to Jailbreak your iPhone/iPod 3GS

    I PROVIDE NO SUPPORT; USE AT YOUR OWN RISK.

    I just want to share how simple and easy it was for ME; it doesn't mean it will work for YOU. Search Google for a solution in case of disaster (iPhone won't boot up); it seems a backup/restore may save you time and grief.

    Jailbreak with Purplera1n: http://purplera1n.com

    Run the exe with your iPhone connected through USB, then click the “make it ra1n” button.

    The iPhone will reboot and a new icon, “Freeze”, will appear. Click on it to install Cydia [WikiPedia]. The operation takes less than 1 minute.

    NOTE:

    • You can also afterwards install cracked applications by drag & drop into iTunes, or by double-clicking them and syncing with your iPhone/iPod Touch!
    • I recommend installing “Pandora Box”, a free app that runs on any non-jailbroken iPhone and lets you review, every day or week, all the paid applications that got a price drop (paid to free, or a drop in price). There is enough (crap) to download for free before paying or even hacking the iPhone.
  • Microsoft and Perceptive Pixel Inc. (PPI) today announced that they have entered into a definitive agreement under which Microsoft will acquire PPI, a recognized leader in research, development and production of large-scale, multi-touch display solutions.

    Jeff Han shows off a cheap, scalable multi-touch and pressure-sensitive computer screen interface that may spell the end of point-and-click.

    Who copied whom? Looking at Apple's timeline, it's pretty clear:

    • … was developed way before
    • 2006 Perceptive Pixel Inc. (PPI)  demo
    • June 29 2007 iPhone
    • July 11 2008 iPhone 3G
    • June 19 2009 iPhone 3GS
    • June 24 2010 iPhone 4
    • October 14 2011 iPhone 4S

    From http://www.ted.com/talks/jeff_han_demos_his_breakthrough_touchscreen.html

    I'm really, really excited to be here today, because I'm about to show you some stuff that's just ready to come out of the lab, literally, and I'm really glad that you guys are going to be amongst the first to be able to see it in person, because I really, really think this is going to change -- really change -- the way we interact with machines from this point on.

    Now, this is a rear-projected drafting table. It's about 36 inches wide and it's equipped with a multi-touch sensor. Now, normal touch sensors that you see, like on a kiosk or interactive whiteboards, can only register one point of contact at a time. This thing allows you to have multiple points at the same time. I can use both my hands; I can use chording actions; I can just go right up and use all 10 fingers if I wanted to. You know, like that.

    Now, multi-touch sensing isn't completely new. I mean, people like Bill Buxton have been playing around with it in the '80s. However, the approach I built here is actually high-resolution, low-cost, and probably most importantly, very scalable. So, the technology, you know, isn't the most exciting thing here right now, other than probably its newfound accessibility. What's really interesting here is what you can do with it and the kind of interfaces you can build on top of it. So let's see.

    So, for instance, we have a lava lamp application here. Now, you can see, I can use both of my hands to kind of squeeze together and put the blobs together. I can inject heat into the system here, or I can pull it apart with two of my fingers. It's completely intuitive; there's no instruction manual. The interface just kind of disappears. This started out as kind of a screensaver app that one of the Ph.D. students in our lab, Ilya Rosenberg, made. But I think its true identity comes out here.

    Now what's great about a multi-touch sensor is that, you know, I could be doing this with as many fingers here, but of course multi-touch also inherently means multi-user. So Chris could be out here interacting with another part of Lava, while I kind of play around with it here. You can imagine a new kind of sculpting tool, where I'm kind of warming something up, making it malleable, and then letting it cool down and solidifying in a certain state. Google should have something like this in their lobby. (Laughter)

    I'll show you something -- a little more of a concrete example here, as this thing loads. This is a photographer's light box application. Again, I can use both of my hands to interact and move photos around. But what's even cooler is that if I have two fingers, I can actually grab a photo and then stretch it out like that really easily. I can pan, zoom and rotate it effortlessly. I can do that grossly with both of my hands, or I can do it just with two fingers on each of my hands together. If I grab the canvas, I can kind of do the same thing -- stretch it out. I can do it simultaneously, where I'm holding this down, and gripping on another one, stretching this out like this.

    Again, the interface just disappears here. There's no manual. This is exactly what you expect, especially if you haven't interacted with a computer before. Now, when you have initiatives like the $100 laptop, I kind of cringe at the idea that we're going to introduce a whole new generation of people to computing with this standard mouse-and-windows-pointer interface. This is something that I think is really the way we should be interacting with machines from this point on. (Applause) Now, of course, I can bring up a keyboard. And I can bring that around, put that up there. Now, obviously, this is kind of a standard keyboard, but of course I can rescale it to make it work well for my hands. And that's really important, because there's no reason in this day and age that we should be conforming to a physical device. That leads to bad things, like RSI. We have so much technology nowadays that these interfaces should start conforming to us. There's so little applied now to actually improving the way we interact with interfaces from this point on. This keyboard is probably actually the really wrong direction to go. You can imagine, in the future, as we develop this kind of technology, a keyboard that kind of automatically drifts as your hand moves away, and really intelligently anticipates which key you're trying to stroke with your hands. So -- again, isn't this great?

    Audience: Where's your lab?

    Jeff Han: I'm a research scientist at NYU in New York.

    Here's an example of another kind of app. I can make these little fuzz balls. It'll remember the strokes I'm making. Of course I can do it with all my hands. It's pressure-sensitive, you can notice. But what's neat about that is, again, I showed you that two-finger gesture that allows you to zoom in really quickly. Because you don't have to switch to a hand tool or the magnifying glass tool, you can just continuously make things in real multiple scales, all at the same time. I can create big things out here, but I can go back and really quickly go back to where I started, and make even smaller things here.

    Now this is going to be really important as we start getting to things like data visualization. For instance, I think we all really enjoyed Hans Rosling's talk, and he really emphasized the fact that I've been thinking about for a long time too: we have all this great data, but for some reason, it's just sitting there. We're not really accessing it. And one of the reasons why I think that is, is because -- we'll be helped by things like graphics and visualization and inference tools, but I also think a big part of it is going to be starting to be able to have better interfaces, to be able to drill down into this kind of data, while still thinking about the big picture here.

    Let me show you another app here. This is something called WorldWind. It's done by NASA. It's a kind of -- we've all seen Google Earth; this is an open-source version of that. There are plug-ins to be able to load in different data sets that NASA's collected over the years. But as you can see, I can use the same two-fingered gestures to go down and go in really seamlessly. There's no interface, again. It really allows anybody to kind of go in -- and, it just does what you'd expect, you know? Again, there's just no interface here. The interface just disappears. I can switch to different data views. That's what's neat about this app here. There you go. NASA's really cool. They have these hyper-spectral images that are false-colored so you can -- it's really good for determining vegetative use. Well, let's go back to this.

    Now, the great thing about mapping applications -- it's not really 2D, it's kind of 3D. So, again, with a multi-point interface, you can do a gesture like this -- so you can be able to tilt around like that, you know. It's not just simply relegated to a kind of 2D panning and motion. Now, this gesture that we've developed, again, is just putting two fingers down -- it's defining an axis of tilt -- and I can tilt up and down that way. That's something we just came up with on the spot, you know; it's probably not the right thing to do, but there's such interesting things you can do with this kind of interface. It's just so much fun playing around with too. (Laughter)

    And so the last thing I want to show you is -- you know, I'm sure we can all think of a lot of entertainment apps that you can do with this thing. I'm a little more interested in the kind of creative applications we can do with this. Now, here's a simple application here -- I can draw out a curve. And when I close it, it becomes a character. But the neat thing about it is I can add control points. And then what I can do is manipulate them with both of my fingers at the same time. And you notice what it does. It's kind of a puppeteering thing, where I can use as many fingers as I have to draw and make --

    Now, there's a lot of actual math going on under here for this to control this mesh and do the right thing. I mean, this technique of being able to manipulate a mesh here, with multiple control points, is actually something that's state of the art. It was just released at Siggraph last year, but it's a great example of the kind of research I really love: all this compute power to apply to make things do the right things, intuitive things, to do exactly what you expect.

    So, multi-touch interaction research is a very active field right now in HCI. I'm not the only one doing it; there are a lot of other people getting into it. This kind of technology is going to let even more people get into it, and I'm really looking forward to interacting with all you guys over the next few days and seeing how it can apply to your respective fields. Thank you. (Applause)