Monday, 29 December 2008

AI 167 MS&V is regarded as a high-growth sector within the current economic climate


The relatively new industry of modeling, simulation and visualization (MS&V) comprises numerous planning, analysis and training tools made possible by sophisticated computing. These tools can suggest and test concepts, minimizing reliance upon trial and error, and they can present information in ways that enhance comprehension.

For example, the tools might teach a medical student how to perform a surgical procedure without putting an actual patient in harm's way. Other applications are seen in simulations to test aircraft designs, in vehicular traffic models to simulate — and improve — flow along highways, in video games to teach algebra, and in models to predict the performance of a soldier or an athlete. Artificial intelligence, robotics and virtual environments also are part of MS&V.

Tuesday, 16 December 2008

AI 166 Big Stage launches a “Portable You” avatar for integration into 3rd party solutions


Big Stage Entertainment (for previous references go to AI 129 and AI 56) announced today the launch of its "Portable You" program, which opens up its 3D facial modeling system to third parties for integration in games, virtual worlds, websites, mobile apps, and more.

The first announced partner is Icarus Studios, which will integrate the system with its virtual worlds platform.

Virtual event solutions provider The Venue Network also plans to adopt the face creation tools for its customers, allowing them to network with their own animated, lip-synched faces.

“Given the complexity of human faces, developing high-fidelity, realistic facial construction systems for avatars is incredibly costly and requires highly specialized skill sets,” said James Hettinger, CEO of Icarus Studios. “PortableYou offers an easy and cost-effective way for us to integrate sophisticated cloning capabilities into the virtual worlds we create. Our clients are very excited that their users can now create accurate virtual representations of themselves for use in their worlds, at development costs that generate significant ROI for incorporating the innovation in user experience that the PortableYou system makes possible.”
Portable You includes APIs, code samples, methods and reference libraries for customization.
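The PortableYou APIs themselves are not documented here, but integrating a third-party face-avatar service into a game or website typically follows an upload, build, retrieve pattern. The sketch below is purely illustrative: every class and method name is a hypothetical placeholder, not part of Big Stage's actual API.

```python
# Hypothetical sketch of embedding a third-party avatar service.
# None of these names come from the PortableYou API; they only
# illustrate the typical upload -> build -> retrieve flow.

class AvatarClient:
    """Illustrative client for a face-cloning avatar web service."""

    def __init__(self, api_key):
        self.api_key = api_key
        self._models = {}

    def build_head_model(self, user_id, photo_bytes):
        # A real service would POST the photo and return a 3D head
        # mesh; here we just record a placeholder model for the user.
        self._models[user_id] = {"mesh": "head.obj",
                                 "source_len": len(photo_bytes)}
        return self._models[user_id]

    def get_model(self, user_id):
        return self._models.get(user_id)

client = AvatarClient(api_key="demo-key")
model = client.build_head_model("user42", b"\x89PNG...")
print(model["mesh"])
```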

“By offering a powerful, unified system for the integration of a realistically animated 3-D version of yourself into your digital life, PortableYou has the potential to revolutionize how we both entertain ourselves and communicate through digital media,” said Phil Ressler, Big Stage Entertainment CEO. “With the high level of personalization that PortableYou makes possible, digital media can evolve to truly reflect who we are in the real world, whether through life-like extension or alter ego fantasy. Faces are fundamental to human communication, and adding recognizable, animated avatars to an application changes it fundamentally.”

Friday, 12 December 2008

AI 165 AI inventor creates his own robotic wife, and he is pleased because she has 13,000 dialogue sentences and can feel pain!


Le Trung, an inventor in Canada, has created his very own fembot called Aiko.
The android's skin is made of silicone and, according to its creator, Aiko "is the first android to mimic pain, and reacts to it."
Trung created the 'pain' technology to aid others, explaining: "This technology can be beneficial for people born with or who have undergone amputations. This is the first step toward a life-like mechanical limb that has the ability to feel physical sensation."

To make the robot more lifelike, Trung has created software called the Biometric Robot Artificial Intelligence Neural System (BRAINS), which gives Aiko the ability to talk and interact with humans, drawing on a database of over 13,000 sentences.
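A 13,000-sentence database points to retrieval-style chat: match the user's input against stored patterns and return a canned reply. The toy sketch below illustrates that idea only; BRAINS is proprietary and its internals are not public, and the patterns and replies here are invented examples.

```python
# Toy retrieval-based dialogue lookup in the spirit of a canned-sentence
# database. The entries are invented; Aiko's BRAINS software is not public.

DIALOGUE_DB = {
    "hello": "Hello! Nice to meet you.",
    "how are you": "I am functioning well, thank you.",
    "what is your name": "My name is Aiko.",
}

def respond(user_input):
    text = user_input.lower().strip("?!. ")
    # Exact match first, then substring match, then a fallback reply.
    if text in DIALOGUE_DB:
        return DIALOGUE_DB[text]
    for pattern, reply in DIALOGUE_DB.items():
        if pattern in text:
            return reply
    return "I do not understand yet."

print(respond("What is your name?"))  # My name is Aiko.
```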

The robot is advanced enough to analyse the weather: if you are about to go outside, Aiko will tell you to bring an umbrella if it is going to rain, or to wear warmer clothes if it is windy.
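The weather behaviour described above amounts to simple rule-based reasoning over forecast data. A minimal sketch of that idea, again not Aiko's actual code, might look like:

```python
# Minimal rule-based sketch of the weather-advice behaviour described
# above. Illustrative only; the real BRAINS implementation is not public.

def weather_advice(forecast):
    """Map simple weather conditions to spoken advice."""
    tips = []
    if forecast.get("rain"):
        tips.append("Bring an umbrella.")
    if forecast.get("windy"):
        tips.append("Wear warmer clothes.")
    return tips or ["Enjoy your day."]

print(weather_advice({"rain": True}))   # ['Bring an umbrella.']
print(weather_advice({"windy": True}))  # ['Wear warmer clothes.']
```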

Wednesday, 10 December 2008

AI 164 Study states that by 2020 social robots will be commonplace in our society, with personalized interactions with humans


Spanish researchers have carried out a study looking into the potential future impact of robots on society.

Their conclusions show that the enormous automation capacity of robots and their ability to interact with humans will cause a technological imbalance over the next 12 years between those who have them and those who do not.


“Just as we depend upon mobile phones and cars in our daily lives today, the next 15 years will see mass hybridisation between humans and robots,” predicts Antonio López Peláez, a professor of sociology at Spain’s National Distance Learning University, UNED, and co-author of the study on the future social impact of robots, jointly carried out with the Institute for Prospective Technological Studies.

International experts working on inventing and adapting cutting edge robots for practical use were interviewed during the study, in order to find out by when we will be regularly using the models they are currently designing.

All agreed on 2020 as a technological inflection point, because by then robots “will be able to see, act, speak, manage natural language and have intelligence, and our relationship with them will have become more constant and commonplace”, said López Peláez.

This will follow a revolution in robotics, after which robots will no longer be merely sophisticated machines, but tools used on a daily basis, helping us with a large number of work and social activities.


The most striking feature of this technological revolution is social robots: machines with artificial intelligence, with which we will have emotional and personalised interactions.

“A robot might be a more effective partner and a better person than the humans we actually have in our immediate lives: just as you can see dog owners talking to their pets today, soon we will be talking to robots,” says López Peláez.

AI 163 HiPiHi continues to grow the Chinese virtual world market with 75,000 registered users


Chinese virtual world HiPiHi now claims 75,000 registered users on mainland China, said President Xu Hui in 21st Century Business Herald. The total number of registrations is likely even higher, as HiPiHi has pursued an international strategy, though not particularly aggressively; its mainland base contributes about 3,000 to 4,000 active users.

According to the report, HiPiHi is also working with the city government to build versions of historic Wuhan, in Hubei province.

Tuesday, 9 December 2008

AI 162 EKI One AI middleware now launched, providing avatar behaviour for authentic character simulation


Germany-based middleware developer Artificial Technology has released the first version of its AI and emotional intelligence middleware solution - EKI One - across Europe, with free trials being made available. We covered this as emergent technology before – please refer to AI 51.

EKI One 1.0 is designed to bolt onto existing game engines and aims to add depth to titles by giving developers an effective, affordable solution to AI and emotion.

"With EKI One 1.0, we proudly present a software solution that allows game and level designers, script designers and programmers to define character behaviour efficiently and with ease, from cognition and movement characteristics to intelligent decision-making," said Serein Pfeiffer, technical director and co-founder of Artificial Technology GmbH.

"The specifications and requests we received from our developer partners have had a direct bearing on the development of EKI One 1.0. We have combined ease of use with the technological depth required for authentic character simulation in a single, well-rounded package."

This is commercially ahead of the work being done by The University of the Balearic Islands (refer to AI 161).

AI 161 Smart responsive facial expressions for avatars in the 3D web and virtual worlds have been developed


The University of the Balearic Islands (UIB) has developed a computer application that enables the generation of faces that express emotions and moods.

The study has been published in the latest edition of the journal Computer Animation and Virtual Worlds.

“The aim of this work has been to design a model that reveals a person's moods and displays them on a virtual face”, SINC was informed by one of the authors of the study, Diana Arellano, from the UIB’s Computer and Artificial Intelligence Graphics and Vision Unit. “In the same 3-D space we have integrated personality, emotions and moods, which had previously been dealt with separately”, Arellano explained to SINC.

The designers have followed the theories of Albert Mehrabian to draw up the model, based on the five personality traits established by this American psychologist:

• Extraversion

• Neuroticism

• Openness

• Conscientiousness

• Agreeableness


“Every personality can be considered an emotional state by default”, indicated Arellano.

An introverted and neurotic personality is therefore related to an anxious emotional state.
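The idea of a "default" emotional state per personality can be sketched by mapping the five trait scores into Mehrabian's pleasure-arousal-dominance (PAD) space. The weights below are illustrative placeholders, not Mehrabian's published regression coefficients, and the trait profile is an invented example.

```python
# Sketch of mapping Big Five trait scores (each in [-1, 1]) to a default
# emotional state in pleasure-arousal-dominance (PAD) space. The weights
# are illustrative, not Mehrabian's actual regression coefficients.

def personality_to_pad(traits):
    e = traits["extraversion"]
    n = traits["neuroticism"]
    o = traits["openness"]
    c = traits["conscientiousness"]
    a = traits["agreeableness"]
    pleasure  = 0.2 * e + 0.6 * a - 0.2 * n
    arousal   = 0.2 * o + 0.3 * a + 0.5 * n
    dominance = 0.3 * o + 0.2 * c + 0.6 * e - 0.3 * a
    return {"pleasure": pleasure, "arousal": arousal, "dominance": dominance}

# An introverted, neurotic profile yields low pleasure, high arousal and
# low dominance: roughly the "anxious" default state mentioned above.
anxious = personality_to_pad({"extraversion": -0.8, "neuroticism": 0.9,
                              "openness": 0.0, "conscientiousness": 0.0,
                              "agreeableness": 0.0})
print(anxious)
```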

The points of the face that define these emotions can be determined mathematically, and the algorithms developed by computer experts can be used to obtain different facial expressions “quickly and easily”.

The system, which uses the MPEG-4 video coding standard for creating images, makes it possible to display basic emotions (anger, disgust, fear, joy, sadness, surprise) and intermediate situations.
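One common way to obtain the "intermediate situations" mentioned above is to interpolate between the facial-parameter vectors of two basic emotions. The sketch below works in the spirit of MPEG-4 facial animation parameters, but the parameter names and values are invented for illustration, not real FAP magnitudes.

```python
# Sketch of blending two facial-expression parameter vectors (in the
# spirit of MPEG-4 facial animation parameters) to produce an
# intermediate expression. Parameter names and values are invented.

def blend_expressions(expr_a, expr_b, t):
    """Linearly interpolate two expressions; t=0 gives expr_a, t=1 gives expr_b."""
    assert 0.0 <= t <= 1.0
    return {k: (1 - t) * expr_a[k] + t * expr_b[k] for k in expr_a}

surprise = {"raise_brow": 1.0, "open_jaw": 0.8, "stretch_lip": 0.0}
fear     = {"raise_brow": 0.7, "open_jaw": 0.4, "stretch_lip": 0.6}

# Halfway between surprise and fear: an intermediate expression.
halfway = blend_expressions(surprise, fear, 0.5)
print(halfway)
```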

“Our next step is to leave the MPEG-4 standard aside and concentrate on a high-quality generic network which will enable the inclusion of both wrinkles and eye, eyelid and head movements, as well as synthesize the voice”, the researcher concluded.

Wednesday, 3 December 2008

AI 160 Layered avatars for 2D web sites continue to grow rapidly, using conversational marketing and new forms of socialization


2D web sites are creating or attracting avatars for conversational marketing and new forms of socialization.

2D web avatar technologies include:

• Clivevideo http://www.clivevideo.com/

• Rocketon http://rocketon.com/

• Weblin http://www.weblin.com/

More technology is also emerging to support avatars with AI, such as:

• Natural Language from LiveWorld’s Livebar http://www.liveworld.com/solutions/livebar.html

• Scripted Language from Decisionality www.decisionality.com

The combination of 2D web avatars and chat engines brings benefits including:

1. Brand stickiness
2. Personalization through conversational marketing
3. Product placement
4. Interactive edutainment
5. Interactive puzzles
6. Interactive books
7. Interactive platform games

Tuesday, 2 December 2008

AI 159 US Army to use OLIVE for immersive learning


Just today, Forterra announced a contract to integrate with Army technology for simulations and collaboration.

"The Army trains 500,000 Soldiers every year and there is a significant diversity of learning abilities among those 500,000 human beings. What we are after is to develop better ways of training them so they can all be trained up to certain skill levels," Dr. John Parmentola, director for Army research and laboratory management, told Nanotechwire.com of the neuroscience research behind the training.

The goal is also to use virtual environments that build on that research. "That's about creating virtual worlds that are essentially indistinguishable from reality. One of the key challenges we have in that area is to create virtual humans. Of course the work of neuroscience plays a key role in trying to create virtual humans that, for all intents and purposes, act and interact just like humans."

For more information please refer to AI 78.