I love this video of Steve Jobs from 1997, answering a hard question in an open and honest way. This is what I want to see more of from all our technology companies. Not polish, not script, just great products that they obviously love and can’t wait to share with us, and a real dialog about the choices they made to get there.
Just watch it.
RSS Copyright © Hottest Mobile Phone News & Reviews | PhoneInferno [Steve Jobs: You have to start with the customer experience and work backwards to the technology], All Right Reserved. 2013.
One of the biggest features of the iPhone 5S is the fingerprint scanner on the home button. Sure, fingerprint-scanning technology has been seen before on Android smartphones, but Apple was the first to implement a reliable, accurate and fast fingerprint scanner with the iPhone 5S.
It’s obvious that many manufacturers are trying to catch up to Apple; however, The Korea Herald has quoted an unnamed Samsung official as saying that Samsung is “not yet developing the technology”. The only Korean company known to be working on fingerprint scanners for smartphones is Crucialtec, which has already implemented one of its modules in the Pantech Vega LTE-A; Pantech is an OEM that Samsung happens to own a stake in.
Researchers claim that Crucialtec is at least a year behind AuthenTec, the company Apple bought a year ago and whose technology is found in the iPhone 5S, so this could be a sticking point for those hoping for a fingerprint scanner in the Galaxy S5.
While Samsung isn’t yet developing anything in the way of fingerprint scanning, that could change quickly if it turns out consumers want the feature. For those of you who absolutely require a fingerprint scanner on your smartphone, the upcoming HTC One Max is rumored to feature the technology.
Do you think fingerprint scanners are a worthwhile feature on smartphones or are you sticking with your password?
While we are still waiting for NVIDIA‘s impressive-looking Tegra 4 processor to officially hit the streets in smartphones, the company is wasting no time teasing what we can expect to see in the future. Not only are we getting a glimpse of the next-gen graphics we’ll likely receive with Tegra 5, but we’re also seeing NVIDIA’s mobile Kepler GPU, which we can expect to find in tons of devices next year.
Back in June, NVIDIA announced it will be licensing out its impressive graphics chipsets and technology portfolio to third parties, mainly for mobile. This means it won’t be sticking to simply creating chipsets like the Tegra 3 and Tegra 4, but will also allow others to license and use the powerful graphics inside those chips.
What this means is that the true graphics power behind the Tegra 4 (and the upcoming Tegra 5) can be utilized by third parties, and not just NVIDIA. Fast forward to today, and the green company is giving us a glimpse of just how stunning and visually impressive that technology is. The same Kepler graphics architecture powering the most powerful desktop PCs has been scaled down for mobile. They’re calling this the mobile Kepler GPU. It’s a bit confusing for average readers, so we’ll just let these demo videos do the explaining.
Project Logan is what we’ll likely come to know as the NVIDIA Tegra 5, which could power smartphones, TVs, tablets, and tons of other electronic devices. The demo above shows the raw graphical performance that the mobile Kepler GPU inside Logan can truly offer, essentially telling us that near-PC-quality, top-end graphics will be arriving on the mobile scene in the near future.
The new mobile Kepler architecture is so powerful that NVIDIA decided a graph was in order, showing its potential next to popular devices like the iPhone 4, Galaxy S II, and even the PlayStation 3. It’s safe to say that console-quality graphics (or something close) are coming to mobile devices once and for all next year. Thanks, NVIDIA.
Obviously this is still a long way off. In a way we’re a bit confused as to why NVIDIA is showing Tegra 5-class features before Tegra 4 has even hit the market on a mass scale, but we suppose it’s never too early to get the word out and show a product’s potential. Right? We’re excited to see how NVIDIA’s licensing and mobile Kepler will shake up the industry in 2014.
A new report says that the Galaxy Note 3 could feature an improved camera capable of offering users optical image stabilization (OIS).
According to etnews, Samsung is apparently working on including OIS capabilities in the unannounced Galaxy Note 3, a feature that should help improve the pictures and videos taken with the phone’s main camera.
OIS is a technology that’s used in various cameras but also in some mobile devices to reduce image blur caused by motion when taking pictures and/or recording videos. In addition to OIS, the Galaxy Note 3 team has apparently also considered including 3x optical zoom capabilities in the phone’s camera, but it looks like that’s not an option for the handset, as it would affect the thickness of the device.
Modern mobile devices already offer software-based image stabilization features but OIS would actually deal with issues caused by motion right at the lens or sensor level.
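To make that distinction concrete, here is a toy sketch of how purely software-based (electronic) stabilization works: it compensates for estimated camera motion by shifting a crop window inside the full sensor frame, sacrificing a border of pixels, whereas OIS physically moves the lens or sensor before the image is ever captured. This is an illustrative simplification, not any manufacturer's actual algorithm, and the function name is our own.

```python
import numpy as np

def digital_stabilize(frame, motion_dx, motion_dy, margin=8):
    """Toy electronic image stabilization (EIS): counter the estimated
    camera motion by shifting a crop window inside the full sensor
    frame, giving up a border of `margin` pixels on each side."""
    h, w = frame.shape[:2]
    # Shift opposite to the measured motion, clamped to the margin.
    dx = int(max(-margin, min(margin, -motion_dx)))
    dy = int(max(-margin, min(margin, -motion_dy)))
    y0, x0 = margin + dy, margin + dx
    return frame[y0:h - margin + dy, x0:w - margin + dx]

sensor = np.arange(64 * 64).reshape(64, 64)   # stand-in for a sensor readout
stabilized = digital_stabilize(sensor, motion_dx=3, motion_dy=-2)
print(stabilized.shape)  # (48, 48): stabilized frame is smaller than the sensor
```

The cropped output is what the phone saves, which is why software stabilization costs you field of view while OIS does not.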
Obviously, we’re treating such reports as rumors at this time, as there’s no way of confirming them just yet. There are plenty of other reports and rumors detailing the next-gen Galaxy Note model, but Samsung has yet to unveil it. The handset is said to be announced at IFA in Germany, which is where its predecessor was also unveiled.
Since Samsung also makes plenty of digital cameras – including the Android-based Galaxy Camera – we’re not surprised that its teams are trying to improve the camera experience of upcoming smartphones, especially flagship devices. After all, it looks like smartphone makers are starting to pay more attention to camera features other than megapixels (see HTC’s UltraPixel and Nokia’s PureView technologies), and Samsung will surely try to improve the cameras of its smartphones.
Samsung is also said to release a Galaxy S4 Zoom camera phone in the near future, a device that would pack better camera features. The device is rumored to offer a 16-megapixel sensor and 10x optical zoom, and it will certainly be interesting to see whether it will also feature OIS capabilities.
Once again Samsung is back showing off their impressive new flagship smartphone with some clever advertising, while also showing they have plenty of money to burn. Could you stare at[...]
Japan Display Inc. (JDI) has unveiled its latest display technology, a 5-inch 1080p TFT LCD display with integrated touch functionality. Called “Pixel Eyes,” the display offers certain advantages including slimmer modules and lower optical reflections.
JDI is a joint venture between Sony, Hitachi and Toshiba, and its panels have already been used in Xperia Z handsets; however, a problem noted by many reviewers was the poor viewing angles. JDI has noted that displays need to improve in some aspects and offer “higher resolutions, wider viewing angles, higher picture quality, lower power consumption, and thinner module thickness.”
The improvements this new technology brings to the table include a 10 to 30% slimmer module, clearer picture quality and increased module brightness. At this point it appears that Pixel Eyes does not use Sony’s recently announced Triluminos display tech.
When it comes to specs, Pixel Eyes will feature a “transmissive IPS” display mode and offer a resolution of up to 1080p at 445 pixels per inch (ppi), 450 cd/m² brightness, a 1000:1 contrast ratio and a 160-degree viewing angle.
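As a quick sanity check on those numbers, the standard pixel-density formula (diagonal resolution divided by diagonal size) puts a 5-inch 1080p panel at roughly 441 ppi, close to the 445 ppi JDI quotes; the small gap presumably comes from the exact panel dimensions.

```python
import math

def pixel_density(width_px, height_px, diagonal_in):
    """Pixels per inch from a panel's resolution and diagonal size."""
    diagonal_px = math.hypot(width_px, height_px)  # pixels along the diagonal
    return diagonal_px / diagonal_in

ppi = pixel_density(1080, 1920, 5.0)
print(round(ppi))  # 441 for a 5-inch 1080p panel
```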
JDI says that displays in resolutions of 720p (720 x 1280 pixels) and qHD (540 x 960 pixels) will be mass produced starting in June 2013, with the 5-inch Full HD displays following soon after. JDI will exhibit the display at the Society for Information Display (SID) Display Week 2013, in Japan this week.
Are you interested in this new display tech? Hope to see the screens in Sony’s next flagship devices?
The way we interact with our devices determines more than what we can do with them. Smartphones became popular not just because of their ability to connect us with the rest of the world, but also because of the connections we built with them, thanks primarily to touch input. Just as the mouse accelerated personal computing, touch input accelerated the growth of smartphones and tablets. When Steve Jobs walked up to announce the iPhone, he spoke of the connection we were about to experience thanks to touch input.
He was right, too: you’re more likely to use a device that you feel connected to, rather than one that makes you feel alienated. A keyboard and mouse seem so disconnected, but actually touching the screen of your phone brought a physical connection to the table. Apple may not have been the first company to use touch as an input method, but it was certainly the most successful. Fast forward to 2013 and a new trend is appearing in the form of gesture-based interaction, showing us just how quickly technology adapts and changes.
But can gesture-based input methods emulate or even surpass the connections we feel with touch input? What sort of applications will gesture control bring? Read on, as we take a three-dimensional adventure into the world of gesture based interaction.
Possibly the most popular form of gesture-based input is Kinect. It’s easy to forget that Kinect is only a little over two years old and in that short period of time, Microsoft has sold over 24 million units.
Kinect utilises an RGB camera, a depth sensor and a multi-array microphone, allowing it to provide full-body 3D motion capture as well as voice and facial recognition. For a good look at just how Kinect works, check out the video below:
Kinect has some wonderful applications away from gaming and Skyping, especially in the field of medicine. Researchers at the University of Minnesota have used Kinect to measure symptoms of disorders like autism and obsessive-compulsive disorder in children. Kinect’s potential is sure to expand as more developers jump on board, and with the Xbox 720 coming soon, the Kinect 2 may just be on the way too.
Smaller than an iPhone and thinner than a MacBook Air, Leap Motion is a nifty gesture-based device that plugs into your PC via USB and attempts to drag the desktop into the 21st century.
Using hand gestures, you are able to control your PC just like you would with a touchscreen or a mouse, but what is revolutionary about this product is that the gestures are based on actions we do in everyday life. If you’re in the mood to transform your room into Hogwarts, check out the demonstration video below:
Mum’s the word when it comes to the exact technology embedded within the Leap controller, but what the developers will tell us is that it can track in-air movements down to 1/100th of a millimeter, meaning it is 200 times more sensitive than Kinect. WOW!
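Taking those two quoted figures at face value, a quick back-of-the-envelope calculation shows what “200 times more sensitive” implies about Kinect’s own tracking resolution; this is just arithmetic on the marketing numbers, not a measured spec.

```python
# Back-of-envelope check of the quoted sensitivity figures.
leap_resolution_mm = 1 / 100        # 1/100th of a millimeter, per Leap Motion
sensitivity_ratio = 200             # "200 times more sensitive than Kinect"

# If Leap resolves 0.01 mm and is 200x more sensitive, Kinect would
# resolve motion only down to about 2 mm.
implied_kinect_resolution_mm = leap_resolution_mm * sensitivity_ratio
print(implied_kinect_resolution_mm)  # 2.0
```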
Leap Motion also has a few big names signed up to use its technology, with ASUS and HP pairing up with the company to bundle it in their PCs. Leap Motion also plans to bring the technology to tablets and phones, so I definitely can’t wait. With its ability to sense multiple fingers, hands and objects, Leap has an incredible future ahead of it. The applications are endless: from the boardroom table to the emergency room, the future is bright.
Samsung and all the S-(insert name here) stuff
Samsung has shown an incredible amount of interest in gesture-based interaction, beginning with the Samsung Galaxy S2 and becoming an ever-present feature in the Galaxy S3 and Galaxy S4. Gesture input was even ported to Samsung’s Smart TV lineup. What began as a simple “turn to mute” gesture turned into a full-blown fixation on gesture-based interfacing, one that was heightened when the S4 was announced.
If the plethora of camera features wasn’t enough to satisfy your insatiable hunger, then the overabundance of ways to control the Galaxy S4 was sure to calm your senses. The features from the S2 and S3 remained, but they were taken to new levels with “Air View” and “Air Gesture”, proving that you didn’t even have to touch your phone to interact with it. Perfect for those countless times you’ve had suntan lotion or juicy rib sauce slathered on your fingers. Check out Samsung’s Galaxy S4 advertisement below if you’re not truly convinced that “Air Gestures” are the future of mobile interaction.
The technology game is a fast-moving business and Samsung isn’t resting on its laurels: it has already begun developing a method of interaction that uses nothing but your mind. This could help people with disabilities better interact with their phones and give them better access to the internet. If you want to learn more about how Samsung is planning on transforming us all into Professor X, check out the full article here.
A major difference between SixthSense and other gesture-based technologies, is that its goal is to merge the physical and digital world into one. What began as a simple contraption using nothing but the rollers in a mouse and some pulleys, has transformed into a neck worn pendant, complete with a projector and a camera.
SixthSense allows you to convert a paper map into a digital one, transform a piece of paper into a tablet and pull information off pieces of paper and into your computer. Through gestures, SixthSense can take photos, zoom in or pan on a map and even transform your wrist into an analog watch. Check out founder Pranav Mistry’s TED talk for a complete look into this fascinating technology.
Gesture-based interaction is here to stay. Interacting with your devices in 3D space is a special, almost surreal kind of feeling and the applications for gesture based input are limitless. With brilliant contraptions like Leap Motion and SixthSense, the future looks dazzlingly bright for gesture-based input.
Do you ever use Kinect on your Xbox? How about “Air Gestures” on your Galaxy S4? Interested in Leap and SixthSense? Let us know in the comments below.
Danish eye-tracking software firm The Eye Tribe has announced the launch of its eye control technology for Android smartphones and tablets. The company made the announcement today during the DEMO Mobile 2013 event, also noting that its developer SDK would be available in June. However, sign-ups have started today, and the technology looks promising.
The startup, founded by former PhD students from the IT University of Copenhagen, says that its eye control technology can be used in a variety of ways, including eye-activated logins, which should make things even more secure than most login technologies. The Eye Tribe CEO and co-founder Sune Alstrup Johansen mentioned that the technology’s accuracy is on par with a fingerprint, thanks to its sub-millimeter pupil tracking.
Eye-control technology is not exactly new, given that companies like Samsung do employ some form of eye-tracking in their latest flagship devices. However, The Eye Tribe wants to help developers build new applications and uses for eye-tracking, which can include the aforementioned user login and gaze-based controls. App developers can even use eye tracking to determine engagement, which can come in useful when researching which apps or designs get the most attention from users.
The technology is not compatible with all Android devices, though, due to some hardware requirements. However, The Eye Tribe says the required additions carry only a small extra cost for manufacturers, which means it should not be very difficult for smartphone and tablet makers to introduce eye-tracking technology in their devices.
The Eye Tribe earlier received seed funding from a plethora of European investors. The Danish National Advanced Technology Foundation also provided the software firm with a grant for a three-year project, and it will not take an equity stake in any of the partnering companies “in the name of job creation and innovation in Denmark.”
Qualcomm announced via its blog on Feb. 14 that some of its Snapdragon chips ship with a technology it calls Quick Charge 1.0, which makes devices charge up to 40% faster than they normally would.
A device without a Quick Charge 1.0-enabled Snapdragon chip can take up to four hours to get a full charge, which means your device spends more time tethered to an outlet. With Quick Charge 1.0, your device becomes “truly mobile,” taking three hours or less to reach a full charge. The beautiful thing about Quick Charge is that this is all possible through existing USB charging accessories; new cables and chargers are not necessary for it to work.
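Reading “40% faster” as a 40% higher charge rate (our interpretation; Qualcomm’s post doesn’t spell out the math), the numbers line up with the claim:

```python
def quick_charge_time(baseline_hours, speedup_pct=40):
    """Charge time when the charge rate is speedup_pct percent higher:
    a 40% faster rate means the same energy transfers in 1/1.4 the time."""
    return baseline_hours / (1 + speedup_pct / 100)

t = quick_charge_time(4.0)
print(round(t, 2))  # 2.86 hours, consistent with "three hours or less"
```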
At the end of Qualcomm’s blog post the company mentioned that this technology is available in more than 70 Snapdragon-based devices today, including the Galaxy S3. Not only that, but Qualcomm also hinted at even faster charging technology in the works by saying, “come back next week for exciting news on the newest advancement in fast charging technology.”
Qualcomm is truly innovating with this technology, which it acquired in June of 2012. Most of us have become so attached to our smartphones that leaving them on a charger for four hours is a strenuous task. Be sure to check out Qualcomm’s blog post to see a full list of devices that have a Snapdragon chip with Quick Charge 1.0!