An Interactive Autonomous Musical Robot
Creating Music with the Ugobe Pleo

Andria Poiarkoff
Sonic Arts Research Centre, Queen's University Belfast, Belfast BT7 1NN
firstname.lastname@example.org

As an exploration of autonomous musical robotics, the Ugobe Pleo has been used as a platform for experimentation. The robotic dinosaur has been researched, studied, dissected, augmented, rebuilt, and connected via a serial interface to a digital signal processing application. The goal of the project is to utilize the Pleo's own actions to allow the autonomous creation of a soundscape, while allowing interactive participation and manipulation through the robot's external sensory system.

Keywords: autonomous robots, robot music, musical interfaces, interactive systems, human-robot interaction
1. Introduction

The field of musical robotics has existed for decades, from devices such as the player piano and metronome-based percussionists all the way to the robotic instrumentalists of LEMUR. While musical robots exist, even autonomous musical robots such as those of LEMUR, very few systems combine autonomy and human interaction. The use of robotic applications as tools for the investigation of interaction, or of relevant behavior in general, is not established within music research. There are, however, applications that can be considered relevant for topics currently discussed with respect to music-related behavior, such as the use of gestures and more general bodily movements. When it comes to gesture and movement, however, most musical robots rely on standard mathematical approaches to music creation, and while they function autonomously, the sound is based entirely on strict algorithms. The Sony Aibo has been used for musical applications, such as a project creating sound in response to emotion, and another in which the Aibo recognizes tempo and percussive sounds. The Aibo was full of technology that could be put to use in interactive applications; however, its cost and later discontinuation made it an impractical platform for development. The goal of this project is to utilize the Ugobe PLEO as an accessible platform, developed to function as an interactive autonomous musical robot.

2. Default Configuration

All PLEO robots contain the same hardware and software configuration, though as the standard programming allows PLEO to develop an individual personality, after a period of time different PLEOs will show different actions and abilities.

2.1 Sensory System

Interaction with PLEO is based on a system of sensors:
- 8 touch sensors: head, chin, back, rear, and one on each leg
- tilt/motion sensors, which detect changes in position
- an IR proximity sensor located in the nose
- a color camera located in the nose
- an IR interrupter located in the mouth
- binaural microphones
- 4 force sensors, one on each foot

The sensors in the PLEO robot are programmed for specific interactions. Each touch sensor triggers a behavior or behavior set. The tilt and motion sensors allow PLEO to respond to being held in unnatural positions, such as upside-down, and to call for help when it has fallen over. The IR sensor in the nose is only functional within a distance of approximately 3 to 4 inches and is used to allow the robot to navigate around objects as well as to react to objects placed in front of it. The color camera, while holding great potential, is only programmed to react to particular shades of green and yellow. These colors are used in PLEO's leaf toy, and it will respond to other objects of the same color. The IR interrupter located in the mouth is used to detect objects in the mouth, which then triggers behaviors such as eating or tugging. PLEO's binaural microphones serve only the purpose of allowing PLEO to respond in the direction of a sound. For instance, if one were to talk to the left of PLEO's head, it may turn its head to the left. If a loud sound is made, such as a clap, it will turn its head away. PLEO also has force sensors in its feet, partly for the purpose of keeping it from walking off of surfaces, but also to allow for interactions such as a high five.
2.2 Mechanics

PLEO's movement relies on 14 servos and force-feedback sensors. The motorized joints include the neck, tail, waist, mouth, eyes, and two in each leg. PLEO has three default poses: standing, sitting, and sleeping. From a physical perspective PLEO is able to walk, dance, talk, wag its tail, and respond in a manner not unlike a small dog. Because of its internal motion sensors, PLEO is able to respond physically to force and acceleration. For instance, if one were to place PLEO on a skateboard and push it forward, PLEO would compensate for the movement and balance itself. In the same manner, if one places a finger in PLEO's mouth and tries to pull away, it will play tug of war. PLEO's mechanics do have some limitations. The servos controlling the legs and mouth are quite slow and can easily be overworked when programming new motions. In addition, some motions cannot be performed at the same time because of shared servos; for instance, PLEO cannot open its mouth while its eyes are closed.

2.3 Controls

PLEO contains 6 separate processors. The main processor is a 32-bit ARM7, which handles most of PLEO's input as well as SD, USB, and Flash data. This can be considered the brain of the PLEO robot. A separate 32-bit ARM7 microprocessor in PLEO's head handles input from the microphones, camera, and IR sensors. Four additional 8-bit processors and two dual H-bridge chips control the motors.

2.4 Operating System

PLEO runs an operating system called Life OS. Life OS is written in Pawn, a simplified version of the C programming language. The operating system is based on pseudo-AI, progressing through three stages to develop a personality based on experience. When a PLEO is first turned on it has the behaviors of a baby: sleeping regularly, cooing, and craving affection. The next stage involves PLEO beginning to clumsily explore its environment. Interaction during this stage will shape the eventual personality of the PLEO. Once its personality has been developed, PLEO will explore on its own as well as learn new actions. Currently Life OS is closed source. A PLEO Developer Kit was due to be released in early 2009, but with the collapse and subsequent liquidation of Ugobe, it is unlikely the kit will ever be made public.
3. Tools and Resources

3.1 Third-Party Software

Numerous programs have been developed to customize the PLEO robot. MySkit is a program developed by DogsBody that allows one to build behaviors using a motion sequencer, programming each servo over a period of time, a concept similar to computer animation. Audio files can be used alongside the motions, and a 3D real-time representation of the movements allows one to preview a behavior without having to write to an SD card. YAPT (Yet Another Pleo Tool) was released by the AiboHack group and is used to create personalities for PLEO. It is not a powerful tool, but it can be combined with custom behaviors to develop different personalities. There is also a program called Dino-MITE (Dinosaur Monitor w/ Integrated Executables), released by BAUER Independents Ltd., that allows one to receive and monitor data from PLEO's various inputs.

3.2 Resources

As the PLEO is relatively new, and now discontinued, resources for the robot are predominantly produced by amateur robot enthusiasts and only found online. Most information is found on forums and hobby sites. AiboHack contains a large amount of detailed software and hardware information. Sites such as howstuffworks, iFixit, and Robostuff contain articles relating to the release and specifications of the PLEO as well as user-submitted projects and hacking tips. The forum at Bob the Pleo is an excellent community containing numerous threads on technical problems, hardware and software, and projects in development. GRIP, the Group for Interdisciplinary Psychology in Germany, is an academic group attempting facial recognition and emotion sensing with the PLEO. Their resources provide a step-by-step manual for upgrading the robot's camera and give insight into PLEO's visual system and the potential it holds.

4. Exploration and Development

4.1 Hardware Exploration

Exploring PLEO's hardware system involved careful dissection. The robot's skin was removed by separating the glued seams, then cutting around areas that were attached to the plastic skeleton. Once this was accomplished, PLEO's outer shell (two large pieces forming the robot's back) was removed. Underneath the shell are additional shields preventing one from damaging the complex servo system. The next item removed was the skull. Underneath PLEO's skull lies a large amount of vital circuitry, including that controlling the eyes, mouth, camera, IR sensor, front speaker, and binaural microphone ears. The next step involved carefully separating PLEO's torso joint. Once this was accomplished one could observe the main processors. PLEO's speakers were replaced with larger, higher-quality speakers to alleviate the roughness of the sounds produced. The robot's microphones were replaced with slightly larger, more sensitive microphones. While the speaker replacement did not result in much improvement, the effects of the new microphones were instantly visible. While PLEO previously only responded to loud sounds such as a clap or sharp command, it now focused curiously in the direction of my lab mates as they spoke at a normal conversational level.

4.2 Software Exploration

After experimenting with the three discovered third-party applications, software exploration with the PLEO robot was initially done through the use of MySkit via SD card. Movements and sound were combined and sequenced into complex actions. These actions were then programmed as responses to particular sensor stimuli. While vital in gaining an understanding of how PLEO's mechanical, audio, and personality systems work together, the programs created in MySkit were effectively replacing the original embedded program and as such removed PLEO's overall autonomy. It was decided that access to PLEO's system would need to be through a means other than the autonomy-destroying SD card. PLEO has a built-in USB port, which is utilized by the Dino-MITE program; however, the PC to which the robot was connected had difficulty recognizing the port. Accessing PLEO's additional serial port was attempted as detailed in the article "How to Control Pleo Wirelessly with Wii Nunchuk". A cable was created as per the instructions, but while the PC recognized the serial port, there was no data stream. After numerous attempts at accessing both ports, it was realized that the issue was not with the robot itself, but with the computer's own serial drivers. PLEO was eventually successfully connected via USB to a MacBook Pro using CoolTerm, a freeware serial port terminal application.

4.3 Development

Once an understanding of the hardware and software capabilities had been gained, it was time to move on to finding ways of utilizing the PLEO as a platform for development. PLEO's system is designed to accept text commands. Connecting the robot's serial port through CoolTerm allowed access to the internal system.
Using a simple "help" command generates a screen of the comprehensive workings of PLEO. Through this interface one is able to initiate behaviors, actions, and sounds, print details of the robot's current status, load programs, and, most importantly, access the data streams for every one of PLEO's sensory inputs. These data streams could be observed for each sensor independently, or for all sensors, in real time. At this point PLEO had developed from a partially hacked robot into a fully accessible hardware interface. An unexpected problem occurred at this point. PLEO is made to accept software updates via USB, and because of this, and the fact that disconnecting the robot mid-update could destroy its operating system, it has a safety mode that causes it to cease physical activity while the USB is connected. In order to keep PLEO responding as its normal self while connected to the computer, the entire operating system was copied to an SD card. The SD card then serves as the primary boot drive, and PLEO is back to its active self.
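Outside of CoolTerm, the same text-command workflow can be scripted. The sketch below, assuming pySerial is installed, frames a PLEO shell command as ASCII terminated by a carriage return; the port path, baud rate, and the "help" command string are illustrative assumptions, not a documented API.

```python
# Minimal sketch of sending text commands to PLEO's serial shell.
# Only the "type a command, end with a return" behavior comes from the
# text above; the port path and baud rate below are assumptions.

def frame_command(cmd: str) -> bytes:
    """Encode a PLEO shell command as ASCII terminated by a carriage return."""
    return cmd.encode("ascii") + b"\r"

def send_command(port, cmd: str) -> None:
    """Write a framed command to an already-open pySerial port object."""
    port.write(frame_command(cmd))

if __name__ == "__main__":
    # Hypothetical usage -- requires a connected PLEO and pySerial:
    # import serial
    # with serial.Serial("/dev/tty.usbserial", 115200, timeout=1) as port:
    #     send_command(port, "help")
    print(frame_command("help"))  # b'help\r'
```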
5. Computer Interfacing
5.1 Interfacing with PLEO

The musical portion of this project utilizes Max/MSP (detailed in Appendix IV, Max/MSP Development). The first challenge was creating a process by which Max/MSP would recognize the PLEO as a serial interface. The port itself was identified immediately by the program, as is expected for any standard USB device. The complexities emerged, however, when communication with PLEO was attempted. PLEO responds only to its own set of text commands, and no data could be streamed from the robot until it was given a command to output the data. While this task was originally regarded as being very complex, after large amounts of unsuccessful work it was realized that the elusive solution was actually quite simple: one had only to create a message in Max/MSP, send that message character by character, and send a return command.

Signal Routing

The PLEO robot has 40 sensor outputs plus a further 16 for joint feedback. As such, sending a single "monitor enable all" command results in a real-time data stream of the status of 56 separate sensors. These streams needed to be converted into usable data. In addition, while the data from all 56 sensors came in a single block, the joint and sensor data were broken into two lines preceded by text. The first step was to convert the data from integers to ASCII. From there the ASCII symbols were separated into individual messages. At this point the data could be routed based on their text prefixes (joint and sensor). The joint and sensor data were then handled and unpacked separately, so that the data arrived in two groups, one of 16 and one of 40.

5.2 Creating the PleoPatch

The PleoPatch is a Max/MSP patch in the form of a simple interface that allows monitoring of the real-time status of all 56 sensors. The entire process detailed previously requires one to do nothing more than choose a port and click start. This patch can easily be turned into an object (or two, to keep the sensor categories separate) to give instant access to PLEO's data streams in further patches.
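For readers without Max/MSP, the routing stage described in Signal Routing can be approximated in Python. The literal "joint"/"sensor" prefixes and whitespace-separated values are assumptions based on the description above; the actual patch performs this with Max objects.

```python
# Illustrative re-implementation of the PleoPatch routing step: a
# monitor block arrives as text lines, one prefixed for joint feedback
# (16 values on the real robot) and one for sensors (40 values).

def route_monitor_block(block: str) -> dict:
    """Split a monitor text block into 'joint' and 'sensor' integer lists."""
    streams = {"joint": [], "sensor": []}
    for line in block.splitlines():
        parts = line.split()
        if parts and parts[0] in streams:
            # Everything after the prefix is a whitespace-separated value.
            streams[parts[0]] = [int(v) for v in parts[1:]]
    return streams

# Example with shortened lines (a real block carries 16 and 40 values):
data = route_monitor_block("joint -90 0 30\nsensor 0 1 0 0")
# data["joint"] == [-90, 0, 30]; data["sensor"] == [0, 1, 0, 0]
```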
6.1 Utilizing Data Streams

PLEO's sensors output data in different manners based on what they are being used for. The data from the joints is sent in degrees, positive or negative, based on the possible range of motion of that joint: the neck can move horizontally from -90 to 90, the shoulders from -30 to 30. The external sensors only read charge, and therefore only output a 0 or 1.

6.2 Mapping

Due to the constant data stream being received (for example, a constant stream of 1s when touching PLEO's head), it was essential to gate all of the data to prevent incessant multi-triggering. The mapping for this project was created in such a way that a signal meeting a positive condition results in the closure of that signal's gate, cutting off the data throughput until the signal itself changes in value. The joint signals are used to trigger a sound or effect when the output changes to a positive value, and the sensor signals trigger based on positive activity.

6.3 Creating the Soundscape

The patch can be used to map any sound to any data stream, but for the purposes of this project the sensors were used to create atmospheric sound, and the joints were programmed with either samples of a single note or a sample of a series of notes on a single instrument. The main sensors on the back and head were used for special effects, such as reverb and pitch-shifting. The result was a complex, and at times surreal, prehistoric stormy soundscape overlaid with single bells and wind chimes.

6.4 Interaction and Experience

Interacting with the musical PLEO is a unique experience. The PLEO itself, in its autonomous behavior, will create a soundscape whether it is interacting with itself, its environment, or external stimuli. The installation could run without the aid of human participants. However, actively interacting with PLEO adds another dimension to the piece. It can trigger sounds that PLEO normally wouldn't create on its own, and alter the sound through effects programmed into particular sensors. These interactions allow a participant to play with the robot while actively sculpting the soundscape.
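The gating described in section 6.2 amounts to a change detector: a sensor streaming a constant 1 while PLEO's head is touched should trigger its mapped sound once, then stay closed until the value changes. A minimal Python sketch of that logic (the real project does this in Max/MSP):

```python
# Change-gate: pass a value through only when it differs from the
# previous one, suppressing the constant stream a held touch produces.

class ChangeGate:
    def __init__(self):
        self._last = None

    def push(self, value):
        if value == self._last:
            return None          # gate closed: suppress repeats
        self._last = value       # remember the new state; reopen on change
        return value

gate = ChangeGate()
events = [gate.push(v) for v in [0, 1, 1, 1, 0, 1]]
# events == [0, 1, None, None, 0, 1] -- each touch triggers exactly once
```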
The developmental process of PLEO as a musical robot, while faced with numerous complexities, furthered the knowledge and possibilities of this particular dinosaur. The partial dissection of the robot allowed insight into how it physically functions, but also exposed the fragility of the mechanics inside PLEO. Even working with the few parts that were replaced in the experimental stages led to an incessant worry that something would be accidentally broken. The only major setback was in gaining access to PLEO's serial port, and it is advised that one allow time and patience for this aspect of PLEO hacking. If it simply will not work, the problem is most likely the computer's own serial drivers. Handling the data streams could be made more convenient by breaking the groups into further subsections, such as a group for the foot sensors, a group for the leg sensors, etc. If one were to record the possible degree movements of the joints, they could be mapped accordingly and used to manipulate data rather than trigger it. The overall presentation of the project was unique and the interaction extremely intuitive. It takes only a few seconds to realize how movement and touch within the robot are used to create the soundscape.

As an example of an interactive autonomous musical robot, PLEO combined with Max/MSP created a musical experience like none other. Robotic instruments, musicians, and composers are becoming commonplace, but most often function either entirely autonomously or entirely based on human input. The musical PLEO combines these two aspects to create an experience and sound that is both fun and sonically appealing. The PLEO is an excellent platform for autonomous, interactive, and in this case sonic experimentation.
References

- Burger, B. & Schmidt, L., 2009. Considerations Concerning a Methodology for Musical Robotics and Human-Robot Interaction. 7th Triennial Conference of the European Society for the Cognitive Sciences of Music.
- Jee, E.S., Kim, C.H., Park, S.Y. & Lee, K.W., 2007. Composition of Musical Sound Expressing an Emotion of Robot Based on Musical Factors. RO-MAN 2007: The 16th IEEE International Symposium on Robot and Human Interactive Communication, pp. 637-41.
- Jordà, S., 2001. New musical interfaces and new music-making paradigms. Proceedings of the 2001 Conference on New Interfaces for Musical Expression, pp. 1-5.
- Kapur, A. & Singer, E., 2006. A retrieval approach for human/robotic musical performance. Proceedings of the International Conference on Music Information Retrieval, pp. 363-4.
- Kapur, A., 2005. A history of robotic musical instruments. Proceedings of the International Computer Music Conference (ICMC).
- Miranda, E.R. & Tikhanoff, V., 2005. Musical Composition by Autonomous Robots: A Case Study with AIBO. Proceedings of TAROS 2005 (Towards Autonomous Robotic Systems).
- Schmidhuber, J., 2006. Developmental robotics, optimal artificial curiosity, creativity, music, and the fine arts. Connection Science, 18(2), pp. 173-87.
- Singer, E., Feddersen, J., Redmon, C. & Bowen, B., 2004. LEMUR's musical robots. Proceedings of the 2004 Conference on New Interfaces for Musical Expression, pp. 181-4.
- Suzuki, K., Ohashi, T. & Hashimoto, S., 1999. Interactive Multimodal Mobile Robot for Musical Performance. Proceedings of the International Computer Music Conference.
- Tejada, S. AI, AIBO and ART: Inspiring Interaction with Computer Science.
- Vallis, O., Hochenbaum, J. & Kapur, A., 2008. Extended Interface Solutions for Musical Robotics. Tenth IEEE International Symposium on Multimedia (ISM 2008), pp. 495-6.
- Weinberg, G., Raman, A. & Mallikarjuna, T., 2009. Interactive jamming with Shimon: a social robotic musician. Proceedings of the 4th ACM/IEEE International Conference on Human-Robot Interaction, pp. 233-4.
- Andrey, 2009. How to Control Pleo Wirelessly Using Wii Nunchuck. Robostuff. http://robostuff.com/diy-projects/pleo-hacking/how-to-control-pleo-wirelessly-using-wii-nunchuck
- Group for Interdisciplinary Psychology. Hacking Pleo: Working with the Pleo Platform. http://www.grip-online.com/en/pleo_hack
- Pleo Hardware and Software Forum. http://bobthepleo.com/forums/index.php
- Pleo Teardown. iFixit. http://www.ifixit.com/Guide/First-Look/Pleo/597/1
- Pleo Technical Info. AiboHack. http://www.aibohack.com/pleo/index.html
- Wilson, T. How Pleo Works. howstuffworks. http://science.howstuffworks.com/pleo.htm
- BAUER Independents Ltd. Dino-MITE [computer file]. http://www.bauerindependents.com/SUBMAIN/dinomite.htm
- DogsBody. MySkit Performance Editor for PLEO [computer file]. http://www.dogsbodynet.com/myskit/index.html
- YAPT: Yet Another Pleo Tool [computer file]. http://www.aibohack.com/pleo/yapt.htm
- CoolTerm [computer file]. http://freeware.themeiers.org/
Multimodal Programming Environment for Kids: A Thought Bubble Interface for the Pleo Robotic Character
Kimiko Ryokai, School of Information, Berkeley Center for New Media, University of California Berkeley, Berkeley, CA 94720 USA, email@example.com
Michael Jongseon Lee, School of Information, University of California Berkeley, Berkeley, CA 94720 USA, firstname.lastname@example.org
Jonathan Micah Breitbart, School of Information, University of California Berkeley, Berkeley, CA 94720 USA, email@example.com
We introduce a mixed physical and digital programming environment for children to control robotic characters. We present our design rationale and initial prototype, report the results from our initial evaluation, and discuss ongoing work.
Keywords: tangible, programming, robotic toys, children
ACM Classification Keywords
H.5.1 [Information Interfaces and Presentation]: Multimedia Information Systems - artificial, augmented, and virtual realities.
Robotic toys and animatronics are gaining popularity and starting to be widely available on a consumer level (e.g., by WowWee, Sony, Hasbro, Omron). Ugobe's Pleo is one such consumer-level robotic dinosaur toy targeted at children aged 8 and up. Pleo is designed to be a friendly and curious baby dinosaur that exhibits a variety of life-like movements. Out of the box, Pleo responds to touch and gives an impression of learning by reacting to an individual owner in unique ways. However, in reality, Pleo is not equipped with any learning mechanism but simply runs complex combinations of canned responses that emulate intelligent behavior. Our informal observations of children (age 5-10) playing with Pleo showed that children cuddled with Pleo and, like with a real pet, they wanted to teach Pleo special tricks. Instead of passively responding to what is already programmed into Pleo, we want to give children the opportunity to actively create and control Pleo's behaviors. Pleo is an open source platform, allowing technically capable hobbyists to customize and program their original behaviors beyond the preprogrammed actions (e.g., singing original songs or performing customized dances). However, in order to produce such customized expressions, one needs a relatively high level of technical competency (e.g., knowledge of the C programming language and PAWN scripting). In this regard, our goal is to create an environment that allows children to easily program and control the creature's behaviors.

figure 1. UGOBE's robotic baby dinosaur, Pleo.

Copyright is held by the author/owner(s). CHI 2009, April 4-9, 2009, Boston, MA, USA. ACM 978-1-60558-246-7/09/04.
Pleo Thought Bubble: Combining the Physical and Virtual
During our informal play sessions with an out-of-the-box Pleo, children noticed that Pleo was doing something in response to their physical touch. However, it was not clear to the children which part of their touch was recognized by Pleo, or what Pleo would do in response. Therefore, we wanted to give children real-time access to what goes on in the robotic dinosaur's head so that they could better understand the process as well as which behaviors they can change or control.

Illuminating the Thought Process: Thought Bubble

The metaphor we built on is a thought bubble of the robotic character, through which children tap into the thought process of the character. We use a touch screen in combination with Pleo to show what is happening to Pleo in terms of input (how Pleo is touched), output (what Pleo does in response), and memory (learned pairs of input and output), in real time.
figure 2. Combining physical and GUI programming using the thought bubble of the robotic character as a metaphor.
PROGRAMMING SEQUENCES

Learned behaviors are saved in Pleo's memory bank (see figure 4). This memory bank serves as a repository showing which tricks Pleo has been taught. The left column, labeled "me", represents input, i.e., what the user does (e.g., touching Pleo's chin, head, legs, back, or tail), and the right column, labeled "pleo", represents output, i.e., what Pleo does in response to the input (e.g., singing, wagging its tail, stretching, etc.). Touching a body part associated with the top-most pair will cause the last learned trick to execute.
figure 3. Touch screen interface to Pleos Thought Bubble
figure 4. A close-up of the memory bank showing that some behaviors have a conditional statement (e.g., when the tail is touched), and some do not.
MONITORING THE STATUS AND POSSIBLE ACTIONS

Touching different body parts of the robotic Pleo immediately highlights the chosen area of the Pleo image on the screen and shows the available actions tied to that area (see figure 3). At the same time, Pleo starts to physically perform the listed actions in sequence. For example, if Pleo's chin is touched, Pleo goes through purring, tail wagging, mooing, and singing in sequence, as long as the chin is being touched. These actions performed by Pleo are also highlighted on the thought bubble screen. The desired behavior may be positively reinforced either by feeding the robotic Pleo its physical leaf or by pressing the virtual leaf icon on the touch screen interface. For example, if Pleo's singing behavior while being scratched on its chin is rewarded with the leaf, Pleo associates the chin scratch with the singing behavior. Therefore, the next time Pleo's chin is touched, Pleo treats that input as the cue to start singing.
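The reinforcement loop described above can be sketched in a few lines of Python: while a stimulus is held, Pleo cycles through candidate behaviors, and the leaf binds whichever behavior is playing to that stimulus. The class and names below are illustrative assumptions, not the system's actual code.

```python
# Sketch of leaf-reward learning: touch cycles candidate behaviors,
# reward() binds the currently playing behavior to the stimulus.

from itertools import cycle

class RewardTrainer:
    def __init__(self, candidates):
        # e.g. {"chin": ["purr", "wag_tail", "moo", "sing"]}
        self._candidates = {k: cycle(v) for k, v in candidates.items()}
        self._current = {}   # stimulus -> behavior playing right now
        self.memory = {}     # learned stimulus -> behavior pairs

    def touch(self, stimulus):
        """Return the learned behavior, or cycle to the next candidate."""
        if stimulus in self.memory:
            return self.memory[stimulus]
        self._current[stimulus] = next(self._candidates[stimulus])
        return self._current[stimulus]

    def reward(self, stimulus):
        """The leaf: bind the behavior playing now to the stimulus."""
        self.memory[stimulus] = self._current[stimulus]

trainer = RewardTrainer({"chin": ["purr", "wag_tail", "moo", "sing"]})
trainer.touch("chin")    # "purr"
trainer.touch("chin")    # "wag_tail"
trainer.reward("chin")   # leaf given while tail-wagging
trainer.touch("chin")    # now always "wag_tail"
```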
The items in the left "me" column can be removed by pressing the trashcan icon. Once the input part of a pair is removed, the action is automatically played directly after the preceding pair. For example, figure 4 shows that the topmost pair in the queue would initiate by touching the tail. If the tail is touched, Pleo will start to sing. Immediately after singing, Pleo will wiggle its right leg, as the "me" column is missing and Pleo does not have to wait for any input from the user. After wiggling its right leg, Pleo will wait for the user to touch its back to execute the next pairing. Using the trashcan tool on a lone action sequence will "undelete" the associated body part, returning the pair to its original state. As such, the memory bank allows basic conditional (procedural) as well as sequential behavior programming.

Combining Physical Interaction and GUI

We wanted to allow multiple entry points to interaction with the robotic toy by providing both physical and virtual interfaces. The child may choose to program 1) with the physical toy only, ignoring the screen interface altogether and focusing on physical interactions with Pleo, 2) on the screen interface only, directly controlling Pleo's behaviors through the GUI, or 3) using a combination of the physical interface and the GUI.
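The memory-bank semantics described above can be sketched as a small interpreter: each entry is a ("me", "pleo") pair, and a pair whose "me" half has been deleted runs immediately after the preceding action instead of waiting for a touch. The function below is an illustrative assumption about those semantics, using the figure 4 example.

```python
# Sketch of memory-bank playback: conditional pairs wait for the right
# touch; pairs with a deleted "me" half (trigger is None) auto-play.

def run_memory_bank(pairs, touches):
    """Play the queue, consuming a touch only for pairs that require one."""
    performed, touches = [], list(touches)
    for trigger, action in pairs:
        if trigger is not None:                     # conditional pair
            if not touches or touches.pop(0) != trigger:
                break                               # wrong/missing touch: stop
        performed.append(action)                    # play the action
    return performed

pairs = [("tail", "sing"), (None, "wiggle_right_leg"), ("back", "stretch")]
print(run_memory_bank(pairs, ["tail", "back"]))
# ['sing', 'wiggle_right_leg', 'stretch']
```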
RELATED WORK

A variety of programming environments for robotic creatures have inspired our work. Crickets and the commercially available LEGO Mindstorms are systems of physical LEGO blocks, sensors, actuators, and programming environments that allow children to create their own programmable robotic creations. They invite creators to move between the physical world of model creation with blocks and the virtual world of programming. Topobo is a new construction kit with kinetic memory that invites young children to build 3D creatures and program their movements by directly twisting and turning the physical model. Guo and Sharlin presented a system that allows a person to control a robotic character, Aibo, via Nintendo Wii game controllers. By combining the physical and the virtual, our approach is to have children decide where to focus their actions and allow them to easily move between physical interaction and virtual control.
System implementation

The thought bubble interface was developed in Python using the pyGame library. Communications with Pleo are achieved by sending serial commands over the USB port using the pySerial library. For our initial prototype, we chose four different possible behaviors that Pleo can perform for each of the six stimulus points on Pleo (tail/backside, body/back, back legs, front legs, top of head, chin), for a total of 24 possible behaviors. We chose behaviors that would be memorable yet relatively brief (between four and ten seconds) and that include a combination of movements and sounds. In identifying actions to include, we used the Dino-MITE and MySkit software applications. Dino-MITE allows monitoring of Pleo's COM port connection and joint positions, as well as sending commands to Pleo. Dino-MITE can also list Pleo's built-in behaviors. MySkit is a performance editing program that allows users to create "skits" by manipulating Pleo's joint positions. The behaviors we chose to include are a combination of the built-in behaviors identified through Dino-MITE, complete and modified behaviors from the MySkit library, as well as custom designed behaviors built in MySkit. The custom and modified MySkit behaviors were loaded onto Pleo's SD card. Each behavior has a custom command that our program uses to execute behaviors through the serial USB connection to Pleo. After identifying the behaviors used in our system, we created icons corresponding to each behavior for the "thought bubble" interface. Our program associates each behavior command with the corresponding icon.

Evaluation

We were interested in investigating whether or not children understand the thought bubble interface as a tool to access and control Pleo's behaviors. Another interest was in learning how children's focus shifted back and forth between the physical interface and the GUI.

Participants and Methodology

Nine children between the ages of 5 and 8 participated in our study. Three groups of children played with our system in dyads. One group of children played with the system in a triad. The investigator first briefly introduced Pleo and the thought bubble screen to the children and then left the system for the children to
play by themselves. Each group played for 20-30 minutes with our system. Since it was difficult to gain ready access to Pleos touch sensors, for our initial observation, we decided to use a Wizard of Oz approach where one of the investigators watched the children's interactions and filled in the gap by sending the touch screen interface the relevant information. A simple key press by the wizard or touch of an icon by a child triggers the program to send the corresponding behavior command to Pleo which in turn causes Pleo to act out that behavior. Results The children understood the relation between the thought bubble screen and Pleos action with respect to which part was being touched. They also understood that touching appropriate parts of the screen could actively control Pleos behaviors. Teaching Pleo behaviors had the children very engaged throughout the process. The children also remembered how to access certain types of behaviors, e.g., touching the head to access and activate the Moo sound of Pleo. The children eagerly showed each other different tricks Pleo could perform, Look what he can do! [as they touched the thought bubble screen to navigate and activate desired behaviors]. Many pairs started by focusing on physically touching Pleo and gradually moved on to interacting with the GUI once they had a better understanding of the system. None of the children completely ignored the GUI screen to focus solely on physical play with Pleo. The children did seem to understand the right hand region of the screen to be the memory bank. When asked by the investigator to explain what they thought the right hand region represented, the children answered, Thats what Pleo knows. However, the children did not use the editing
function of the memory bank to create sequences of behaviors. Instead of waiting for the sequence in memory, the children used the left side of the screen to directly control and reinforce Pleos behaviors. They seemed to interpret the memory bank as a log or history, and not something to be acted upon. Therefore, no procedural programming was observed. The function of the memory bank should be made clearer in the future and should perhaps include a go function to cycle through the list without having to touch the body part. At the beginning of the play session, the children seemed to understand the leaf as a reward they give to Pleo to reinforce desired behavior. However, the children quickly wanted to use the leaf to feed Pleo. In a future version of the system, we could provide two types of food: one to feed Pleo and the other to positively reinforce learned behaviors (like a treat). Technical Limitations We did observe the children touching different body parts at once or in rapid succession. Since Pleo is designed to complete an action before it can execute anything else, it will be unable to support commands in rapid succession. To avoid complications with Pleos communication buffer, we plan on implementing a behavior monitor that allows for interruptions and can quickly update new behaviors.
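As a rough illustration of how such a behavior monitor might work, the sketch below coalesces rapid-fire requests so that only the most recent pending behavior is sent once the current one has finished. The class, its callbacks, and the behavior names are ours for illustration; the real monitor would sit between the GUI and the pySerial link.

```python
import time


class BehaviorMonitor:
    """Coalesce rapid-fire behavior requests so the robot is never sent a
    command while it is still acting one out. Only the most recent pending
    request survives; stale requests are silently dropped."""

    def __init__(self, send, duration_of):
        self.send = send                # callable: transmit a behavior command
        self.duration_of = duration_of  # callable: behavior -> seconds it takes
        self.busy_until = 0.0           # time when the current behavior finishes
        self.pending = None             # most recent request not yet sent

    def request(self, behavior, now=None):
        """Record a request; newer requests replace older pending ones."""
        now = time.monotonic() if now is None else now
        self.pending = behavior
        self._dispatch(now)

    def tick(self, now=None):
        """Call periodically (e.g. once per GUI frame) to flush pending work."""
        now = time.monotonic() if now is None else now
        self._dispatch(now)

    def _dispatch(self, now):
        # Send the pending behavior only once the robot is free again.
        if self.pending is not None and now >= self.busy_until:
            self.send(self.pending)
            self.busy_until = now + self.duration_of(self.pending)
            self.pending = None
```

In use, `send` would write the serial command over the pySerial port and `duration_of` would look up the four-to-ten-second length of each behavior; dropping all but the newest request keeps the communication buffer from filling when children touch several body parts in quick succession.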
Figure X. Two children interacting with both the Pleo robot and the thought bubble interface on the touch screen. In general, the children started with robot interaction but then moved to a combination of physical interaction and the GUI.
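The prototype's touch-point dispatch described earlier can be sketched as follows. Only the structure (six stimulus points mapped to four behaviors each, sent as serial commands over USB) comes from the text; the behavior names and the command syntax are invented for illustration, as Pleo's actual command set is not documented here.

```python
# Six stimulus points, four behaviors each, for the 24 behaviors of the
# prototype. All behavior names below are placeholders, not Pleo's real ones.
BEHAVIORS = {
    "tail":       ["wag_tail", "tail_spin", "sit", "sing"],
    "back":       ["stretch", "shake", "wiggle", "yawn"],
    "back_legs":  ["kick", "step_back", "stomp", "crouch"],
    "front_legs": ["high_five", "paw", "bow", "dig"],
    "head":       ["moo", "nod", "look_up", "sniff"],
    "chin":       ["coo", "chew", "head_tilt", "purr"],
}


def command_for(point, choice):
    """Build the serial command bytes for one of the four behaviors at a
    stimulus point (0 <= choice < 4). The 'play <name>' syntax is a guess."""
    behavior = BEHAVIORS[point][choice]
    return ("play %s\r" % behavior).encode("ascii")


# Sending with pySerial might then look like this (port name is a guess):
#   import serial
#   with serial.Serial("/dev/ttyUSB0", 115200, timeout=1) as link:
#       link.write(command_for("head", 0))
```

A key press by the wizard or a child's touch on an icon would select the `(point, choice)` pair, and the resulting bytes would be written to the serial port.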
Discussion and Future Work
The evaluation of our initial design and prototype provided us with ideas for future design and implementation improvements. Specifically, the design of the memory bank needs refinement. The interface should invite children to edit and manipulate behaviors.
One idea is to make the list more like the pages of a storybook, showing a chain of events (e.g., "and then this happens, and then this happens after this," etc.). Even with some technical glitches, the general premise of having a physical robot and accessing its thought bubble to control the character seems promising. In response to the investigator's question, "What does this [pointing at the thought bubble screen] do?" one child responded, "You can jump to it [pointing at different behaviors on the screen] rather than [gestures touching the Pleo robot]." When the investigator asked, "Do you need both the screen and Pleo?" a couple of children answered, "You can just take Pleo with you, but [without the screen] it would be harder to see what Pleo is thinking."
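One hedged sketch of how such a storybook-style memory bank might behave is given below. The class and method names are ours, not the system's, and `perform` stands in for whatever actually triggers a behavior on the robot.

```python
class MemoryBank:
    """Storybook-style memory bank: each taught behavior is a 'page' that
    can be reordered before playback, making the chain of events editable."""

    def __init__(self):
        self.pages = []

    def record(self, behavior):
        """Append a newly taught behavior as the last page of the story."""
        self.pages.append(behavior)

    def move(self, i, j):
        """Reorder the story: move page i so that it becomes page j (0-based)."""
        self.pages.insert(j, self.pages.pop(i))

    def go(self, perform):
        """The proposed 'go' function: cycle through the whole chain in order
        without the child having to touch each body part again."""
        for behavior in self.pages:
            perform(behavior)
```

Presenting `pages` as literal storybook pages in the GUI, with `move` behind drag-and-drop, is one way the editing function could be made more inviting than a plain log.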
We also plan to conduct a longer study to investigate the types of storytelling play children may engage in with programmable robotic characters.
Conclusion

We have presented a mixed physical and digital programming environment for children to control robotic characters. We have given children real-time access to what goes on in the robotic dinosaur's head so that they could better understand the process as well as which behaviors they can change or control. Children knew that they could cuddle and interact with Pleo just like a regular stuffed animal, but it was also clear to the children which part of their touch was recognized by Pleo and what Pleo could do in response. The thought bubble interface offered children an extra lens through which they could tap into the process of the robot's activity and also direct its behavior. We are continuing to improve the interface.

Acknowledgments

We thank the children and their parents; Caleb Chung, John Sosoka, and Barbara Barza from UGOBE; and Matt Bauer for his help with Dino-MITE.

References

Guo, C. and Sharlin, E. Exploring the use of tangible user interfaces for human-robot interaction: a comparative study. In Proceedings of CHI '08. ACM Press (2008), 121-130.
Dino-MITE. http://www.bauerindependents.com/SUBMAIN/dinomite.htm
LEGO MindStorms. http://www.mindstorms.lego.com/
MySkit. http://www.dogsbodynet.com/myskit/index.html
Pleo. http://www.pleoworld.com/
pygame game development library. http://www.pygame.org/news.html
pySerial. http://pyserial.sourceforge.net/
Raffle, H., Parkes, A., and Ishii, H. Topobo: A Constructive Assembly System with Kinetic Memory. In Proceedings of CHI '04. ACM Press (2004).
Resnick, M., Martin, F., Sargent, R., and Silverman, B. Programmable Bricks: Toys to Think With. IBM Systems Journal, vol. 35, no. 3-4 (1996), 443-452.