Wearable technology just keeps on making headlines, partly thanks to Samsung’s newly announced Gear 2 and Gear 2 Neo smartwatches. But amid all the software features and aesthetic designs to get right, we seem to be forgetting that we still need processors to power our wearables. The battle over who will control the potentially huge wearable technology market is well and truly on.
We’ve already touched on the fact that wearable computing requires a new range of power-efficient and, above all else, smaller SoCs than you’d find in a smartphone. ARM’s Cortex-A and Cortex-M chips are already well placed to deal with this emerging market segment, and Intel has also started to make a move with its own Quark processor and Edison development board. ARM is already well out in front here, whilst Edison is a bit too large and impractical at the moment. Furthermore, ARM has also spotted another, rather large, flaw with Intel’s technology – it’s too damn hot!
According to a test conducted in ARM’s own labs, the company’s Cortex-A9 processor, the same core that has found its way into a number of dual- and quad-core smartphone SoCs, can be seen running at a very cool 28.4°C without any cooling solutions applied. That’s pretty much perfect for something you’re likely to wear on your wrist.
The same can’t be said for Intel: its Galileo development board, the predecessor to the new Edison development kit (both are powered by an Intel Quark X1000 chip), reaches a blistering 54.9°C whilst just being used to turn a light on and off. Ouch!
To be fair to Intel, the board would be slimmed down substantially before being used in any product, and some of the heat is probably being generated by additional bits of hardware on the board, including whatever is converting the mains voltage to 5V (ARM’s chip is just running from a battery). Nevertheless, ARM raises an excellent point about temperature. Wearables not only necessitate smaller and more power-efficient chips, but also SoCs that run cool when left on for long periods of time, whilst still being able to power all the things we want our smartwatches to do. Talk about a tall order.
Check out the video below for a closer look at ARM’s temperature test, as well as some cool thermal imaging clips (badum tish).
Playing up the hype ahead of its unveiling this month, LG is little by little divulging features of its upcoming G Pro 2 smartphone. This time the Korean manufacturer is focusing on the camera, with a feature that it simply calls OIS Plus.
OIS, or Optical Image Stabilization, is becoming one of the most wanted features in smartphone cameras. After all, even the highest-resolution camera can be foiled by a shaky hand. LG’s current flagship, the G2, debuted with OIS for its 13 megapixel camera. Now LG is revealing that the G Pro 2 will also have OIS – and a bit more.
What that bit will be, LG still hasn’t fully explained, except that it is something better than plain OIS at correcting blur caused by shaky hand movements, and that it supposedly works great even in low-light situations. But while LG is keeping mum on the plus in its OIS technology, it is revealing all about the two cameras on the G Pro 2. The front-facing camera will have a 2.1 megapixel sensor with improved performance. The rear camera sports 13 megapixels, capable of shooting UHD video at a resolution of 3840×2160. It can also shoot in burst mode at up to 20 photos and can even do slow-motion capture at 1/4 speed, though only in HD quality.
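That “1/4 speed” figure implies a straightforward capture-rate calculation. Here is a hedged back-of-the-envelope sketch – LG hasn’t published a playback frame rate, so the standard 30 fps is an assumption on our part:

```python
# Sketch: capture frame rate needed for 1/4-speed slow motion.
# 30 fps playback is an ASSUMED figure, not one LG has confirmed.
PLAYBACK_FPS = 30   # assumed playback rate
SLOWDOWN = 4        # "1/4 speed" stretches each captured second to four

capture_fps = PLAYBACK_FPS * SLOWDOWN
print(capture_fps)  # 120 frames per second must be captured
```

That higher capture rate is likely why the mode is limited to HD rather than the full UHD resolution: the sensor has to read out four times as many frames per second.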
LG is still keeping parts of the G Pro 2 in the dark for now, but it is believed to be a large 6-inch phablet. If previous press releases are any indication, LG may incrementally reveal bits and pieces of the smartphone before it finally announces the whole package later this month.
Charlottesville, Virginia – Niles Technology is often asked how best to make students better writers. Our answer is simple – with your BYOD program, download and implement our apps to teach the intricacies of writing. Many writing classes are not as effective as possible because of limited step-by-step instructional content to guide students. With Niles iPad and iPhone apps, students never get lost when prewriting and developing their ideas, thereby making the writing of the final essay an easier and more enjoyable task.
Students love the iPad and iPhone, and they center their lives around them. These devices have also changed education through access to many mobile apps; therefore, there are no more excuses for failing to get students motivated about schoolwork. There is an app for any subject matter, and since students like using mobile devices, it follows that schools should use BYOD (bring your own device) programs to excite students about learning. Niles Technology apps specifically get students interested in writing.
Niles Technology apps do for the individual student what is impossible in the classroom – the apps use a proprietary reflexive query and answer protocol to get each student focused on his / her own ideas. Students learn that essays are not rote formats that contain the same things, but are really personal statements that are to be taken seriously. The apps teach the critical thinking and argument development of writing that are most crucial in developing good writing skills. Niles Technology apps are perfect one-on-one apps that motivate students to think and write better.
Parents, teachers and schools are discovering the efficacy of Niles Technology apps. Each month, school volume purchases of the apps increase, which indicates that the use of mobile technology in schools is becoming more commonplace. Parents are also catching on because Essay Workstation 3.0 is the No. 1 gifted app in the Niles inventory. Middle School Writing 2.0 and High School Writing 4.0 are virtually tied as solid No. 2 bestsellers, with both parents and schools purchasing the apps.
* iPhone, iPod touch, and iPad
* Requires iOS 4.3 or later
* 2.0 MB
Pricing and Availability:
Essay Writing Workstation 3.0 is .99 USD (or equivalent amount in other currencies) and available worldwide exclusively through the App Store in the Education category.
Niles Technology Group was founded in 2007 to develop software for emerging technologies and is developing a series of mobile computing applications dedicated to teaching superior writing and logical thinking skills. With its experience in the technology and content required to develop full-featured products for students, Niles Technology Group is a leading app publisher, and the Essay Czar, Achievers Writing Center and Essay Writing Wizard apps have sold successfully worldwide. The key to Niles Technology Group’s success is specificity. Each app is specific to the writing task at hand. Michael A. Niles, the founder, was formerly, for eight years, the President and CEO of The Right Education, Inc. (TRE), a web-based educational technology company that developed The Learning Accelerator. Copyright (C) 2007-2014 Niles Technology Group. All Rights Reserved. Apple, the Apple logo, iPhone, iPod and iPad are registered trademarks of Apple Inc. in the U.S. and/or other countries.
I love this video of Steve Jobs from 1997, answering a hard question in an open and honest way. This is what I want to see more of from all our technology companies. Not polish, not script, just great products that they obviously love and can’t wait to share with us, and a real dialog about the choices they made to get there.
One of the biggest features on the iPhone 5S is the fingerprint scanner on the home button. Sure, fingerprint scanning technology has been seen before on Android smartphones, but Apple was the first to implement a reliable, accurate and fast fingerprint scanner, in the iPhone 5S.
It’s obvious that many manufacturers are probably trying to catch up to Apple; however, the Korea Herald has quoted an unnamed Samsung official as claiming that Samsung is “not yet developing the technology”. The only Korean company known to be working on fingerprint scanners for smartphones is Crucialtec, which has already implemented one of its modules in the Pantech Vega LTE-A – and Pantech is an OEM that Samsung happens to own a stake in.
Researchers claim that Crucialtec is at least a year behind AuthenTec, the company Apple bought a year ago and whose technology is found in the iPhone 5S, so this could be a sticking point for those hoping for a fingerprint scanner in the Galaxy S5.
While Samsung isn’t yet developing anything in regards to fingerprint scanning, that could change quickly if it turns out that consumers want the feature. For those of you who absolutely require a fingerprint scanner on your smartphone, the upcoming HTC One Max is rumored to feature the technology.
Do you think fingerprint scanners are a worthwhile feature on smartphones or are you sticking with your password?
While we are still waiting for NVIDIA‘s impressive-looking Tegra 4 processor to officially hit the streets in smartphones, the company is wasting no time teasing what we can expect to see in the future. Not only are we getting a glimpse of the next-gen graphics we’ll likely receive with Tegra 5, but this also shows us NVIDIA’s mobile Kepler GPU, which we can expect to see in tons of devices next year.
Back in June NVIDIA announced they’ll be licensing out their impressive graphics chipsets and technology portfolio to 3rd parties, mainly for mobile. This means they won’t be sticking to simply creating chipsets like the Tegra 3 and Tegra 4, but also allowing others to license and use the powerful graphics inside said chips.
What this means is that the true graphics power behind the Tegra 4 (and the upcoming Tegra 5) can be utilized by other 3rd parties, not just NVIDIA. Fast forward to today, and the green company is now giving us a glimpse of just how stunning and visually impressive that technology is. The same Kepler graphics platform powering the most powerful desktop PCs has been streamlined for mobile. They’re calling this the Mobile Kepler GPU. It’s a bit confusing for average readers, so we’ll just let these demo videos do the explaining.
Project Logan is what we’ll likely call the NVIDIA Tegra 5, which could power smartphones, TV’s, tablets, and tons of other electronic devices. The demo above is showing the raw graphical performance that the mobile Kepler GPU inside Logan can truly offer. Essentially showing us that near PC quality top-end graphics will be arriving on the mobile scene in the near future.
The new mobile Kepler architecture is so powerful that NVIDIA decided a graph was in order, showing its potential next to popular devices like the iPhone 4, Galaxy S II, and even the PlayStation 3. It’s safe to say that console-quality graphics (or something similar) are coming to mobile devices once and for all next year. Thanks, NVIDIA.
Obviously this is still a long, long way away. In a way we’re a bit confused as to why NVIDIA is showing Tegra 5-type features before Tegra 4 has even hit the market on a mass scale. Then again, it’s never too early to get the word out and show a product’s potential. Right? We’re excited to see how NVIDIA’s licensing and Mobile Kepler will shake up the industry in 2014.
A new report says that the Galaxy Note 3 could feature an improved camera capable of offering users optical image stabilization (OIS).
According to etnews, Samsung is apparently working on including OIS capabilities in the unannounced Galaxy Note 3, a feature that should help improve the pictures and videos taken with the phone’s main camera.
OIS is a technology that’s used in various cameras but also in some mobile devices to reduce image blur caused by motion when taking pictures and/or recording videos. In addition to OIS, the Galaxy Note 3 team has apparently also considered including 3x optical zoom capabilities in the phone’s camera, but it looks like that’s not an option for the handset, as it would affect the thickness of the device.
Modern mobile devices already offer software-based image stabilization features but OIS would actually deal with issues caused by motion right at the lens or sensor level.
Obviously, we’re treating such reports as rumors at this time, as there’s no way of confirming them just yet. There are plenty of other reports and rumors detailing the next-gen Galaxy Note model, but Samsung has yet to unveil it. The handset is said to be announced at IFA in Germany, which is where its predecessor was also unveiled.
Since Samsung also makes plenty of digital cameras – including the Android-based Galaxy Camera – we’re not surprised that its teams are trying to improve the camera experience of upcoming smartphones, especially flagship devices. After all, smartphone makers are starting to pay more attention to camera features beyond megapixels – see HTC’s UltraPixel and Nokia’s PureView technologies – and Samsung will surely try to improve the cameras of its smartphones.
Japan Display Inc. (JDI) has unveiled its latest display technology, a 5-inch 1080p TFT LCD display with integrated touch functionality. Called “Pixel Eyes,” the display offers certain advantages including slimmer modules and lower optical reflections.
JDI is a joint venture between Sony, Hitachi and Toshiba, and its panels have already been used in Xperia Z handsets; however, a problem noted by many reviewers was the poor viewing angles. JDI has acknowledged that displays need to improve in some aspects and offer “higher resolutions, wider viewing angles, higher picture quality, lower power consumption, and thinner module thickness.”
The improvements this new technology brings to the table include a 10 to 30% slimmer module, clearer picture quality and increased module brightness. At this point in time, it appears that Pixel Eyes is not using Sony’s recently announced Triluminos display tech.
When it comes to specs, Pixel Eyes uses a “transmissive IPS” display mode and offers a resolution of up to 1080p, a pixel density of 445 pixels per inch (ppi), 450 cd/m² brightness, a 1000:1 contrast ratio and a 160-degree viewing angle.
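The quoted pixel density can be sanity-checked with the standard formula: ppi equals the diagonal resolution in pixels divided by the diagonal size in inches. A quick sketch – note that an even 5.0-inch diagonal is our assumption; JDI’s quoted 445 ppi implies a slightly smaller active area of roughly 4.95 inches:

```python
import math

# Pixel-density check for a "5-inch" 1080p panel.
# The 5.0-inch diagonal is an assumed round figure.
width_px, height_px = 1080, 1920           # Full HD, portrait orientation
diagonal_inches = 5.0

diagonal_px = math.sqrt(width_px**2 + height_px**2)
ppi = diagonal_px / diagonal_inches
print(round(ppi))  # ~441 ppi for an even 5.0-inch diagonal
```

Running the numbers the other way, 445 ppi corresponds to a diagonal of about 4.95 inches, so the “5-inch” label is simply rounded up.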
JDI says that displays in resolutions of 720p (720 x 1280 pixels) and qHD (540 x 960 pixels) will be mass produced starting in June 2013, with the 5-inch Full HD displays being produced soon after. JDI will exhibit the display at the Society for Information Display (SID) Display Week 2013, in Japan this week.
Are you interested in this new display tech? Hope to see the screens in Sony’s next flagship devices?
The way we interact with our devices determines more than what we can do with them. Smartphones became popular not just because of their ability to connect us with the rest of the world, but also because of the connections we built with them, thanks primarily to touch input. Just as the mouse accelerated personal computing, touch input accelerated the growth of smartphones and tablets. When Steve Jobs walked up to announce the iPhone, he spoke of the connection we were about to experience thanks to touch input.
We’re going to use the best pointing device in the world. We’re going to use a pointing device that we’re all born with – born with ten of them. We’re going to use our fingers.
He was right, too: you’re more likely to use a device that you feel connected to, rather than one that makes you feel alienated. A keyboard and mouse seem so disconnected, but actually touching the screen of your phone brought a physical connection to the table. Apple may not have been the first company to use touch as an input method, but it was certainly the most successful. Fast forward to 2013, and a new trend is appearing in the form of gesture-based interaction, showing us just how quickly technology adapts and changes.
But can gesture-based input methods emulate or even surpass the connections we feel with touch input? What sort of applications will gesture control bring? Read on, as we take a three-dimensional adventure into the world of gesture based interaction.
Possibly the most popular form of gesture-based input is Kinect. It’s easy to forget that Kinect is only a little over two years old and in that short period of time, Microsoft has sold over 24 million units.
Kinect utilises an RGB camera, a depth sensor and a multi-array microphone, allowing it to provide full-body 3D motion capture, as well as voice and facial recognition. For a good look at just how Kinect works, check out the video below:
Kinect has some wonderful applications away from gaming and Skyping, especially in the field of medicine. Researchers at the University of Minnesota have used Kinect to measure symptoms of disorders like autism and obsessive-compulsive disorder in children. Kinect’s potential is sure to expand as more developers jump on board, and with the Xbox 720 coming soon, the Kinect 2 may just be on the way too.
Smaller than an iPhone and thinner than a MacBook Air, Leap Motion is a nifty gesture-based device that plugs into your PC via USB and attempts to bring the desktop into the 21st century.
Using hand gestures, you are able to control your PC just like you would with a touchscreen or a mouse, but what is revolutionary about this product is that the gestures are based on actions we do in everyday life. If you’re in the mood to transform your room into Hogwarts, check out the demonstration video below:
Mum’s the word when it comes to the exact technology embedded within the Leap controller, but what the developers will tell us is that it can track in-air movements down to 1/100th of a millimeter, meaning it is 200 times more sensitive than Kinect. WOW!
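Taking those two figures at face value, the “200 times more sensitive” claim also tells us something about Kinect. A quick back-of-the-envelope sketch:

```python
# What "200x more sensitive than Kinect" implies, taking Leap's
# quoted 1/100 mm tracking precision at face value.
leap_precision_mm = 1 / 100    # 0.01 mm, as quoted by the developers
sensitivity_ratio = 200        # the claimed advantage over Kinect

implied_kinect_precision_mm = leap_precision_mm * sensitivity_ratio
print(implied_kinect_precision_mm)  # 2.0 mm implied for Kinect
```

In other words, the claim implies Kinect resolves motion to about 2 mm – plenty for full-body tracking, but nowhere near fine enough for individual fingertips, which is exactly the gap Leap is targeting.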
Leap Motion also has a few big names signed up to use its technology, with ASUS and HP pairing up with the company to bundle it in their PCs. Leap Motion also plans to bring the technology to tablets and phones, so I’m definitely holding my breath. With its ability to sense multiple fingers, hands and objects, Leap has an incredible future ahead of it. The applications are endless – from the boardroom table to the emergency room, the future is bright.
Samsung and all the S-(insert name here) stuff
Samsung has shown an incredible amount of interest in gesture-based interaction, beginning with the Samsung Galaxy S2 and becoming an ever-present feature in the Galaxy S3 and Galaxy S4. Gesture input was even ported to Samsung’s Smart TV line-up. What began as a simple “turn to mute” gesture turned into a fixation with gesture-based interfacing that only heightened when the S4 was announced.
If the plethora of camera features wasn’t enough to satisfy your insatiable hunger, then the overabundance of ways to control the Galaxy S4 was sure to calm your senses. The features from the S2 and S3 remained, but they were taken to new levels with “Air View” and “Air Gesture”, proving that you don’t even have to touch your phone to interact with it. Perfect for those countless times you’ve had suntan lotion or juicy rib sauce slathered on your fingers. Check out Samsung’s Galaxy S4 advertisement below if you’re not truly convinced that “Air Gestures” are the future of mobile interaction.
The technology game is a fast-moving business and Samsung isn’t resting on its laurels, so it has already begun developing a method of interaction using nothing but your mind. This could help people with disabilities better interact with their phones and give them better access to the internet. If you want to learn more about how Samsung is planning on transforming us all into Professor X, check out the full article here.
A major difference between SixthSense and other gesture-based technologies, is that its goal is to merge the physical and digital world into one. What began as a simple contraption using nothing but the rollers in a mouse and some pulleys, has transformed into a neck worn pendant, complete with a projector and a camera.
SixthSense allows you to convert a paper map into a digital one, transform a piece of paper into a tablet, and pull information off pieces of paper and into your computer. Through gestures, SixthSense can take photos, zoom in or pan on a map, and even turn your wrist into an analog watch. Check out founder Pranav Mistry’s TED talk for a complete look at this fascinating technology.
Gesture-based interaction is here to stay. Interacting with your devices in 3D space is a special, almost surreal kind of feeling and the applications for gesture based input are limitless. With brilliant contraptions like Leap Motion and SixthSense, the future looks dazzlingly bright for gesture-based input.
Do you ever use Kinect on your Xbox? How about “Air Gestures” on your Galaxy S4? Interested in Leap and SixthSense? Let us know in the comments below.