Robots can feel too
Wednesday, 25 March 2015
It's alive!!!
And so Ghost moves!! After some days struggling with V-REP's motion planning scripts, I figured out that something wasn't working in my favor; maybe it was the lack of dynamics on Ghost, or just a lack of better judgment on my part, but I gave up on motion planning and started writing a script from scratch.
The first issue was whether to write a non-threaded or a threaded child script. I started with a non-threaded one, since it is said to be the lighter of the two on the machine, but non-threaded scripts behave like functions: they are called once per simulation pass and run from scratch every time, which was not what I wanted. So I turned to threaded scripts, and it seemed to be going great; I could write functions inside the script much like in C, with a few Lua quirks, so all rainbows and sunshine, just follow the API instructions and you will be fine, right? Well, kind of... Most of those instructions teach you to keep the script synchronized with every simulation pass, which is good when you are reading data, but bad when you are trying to run commands that take longer than one pass. So, for a full day, my script ran in a loop that always called the first function, or at least the part of the code that could execute within the time of one pass (microseconds, maybe less, which is not much code), and every time I tried to take that loop out it simply did nothing, since the script was no longer in sync with the simulation :D. But if you have already seen the video, you know this story has a happy ending: after some reading and studying of the demo scripts, I managed to put the script in sync with the simulation but not with the individual passes, which means the script still runs in a loop, but it only starts again after the previous run has finished.
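To make that idea concrete, here is a minimal sketch of that kind of threaded child script; the joint name 'Ghost_waistJoint', the targets and the step size are placeholders, not my actual scene setup. The trick is to yield with simSwitchThread() once per simulation pass, so the sequence only restarts after the previous run is done.

```lua
-- Minimal sketch of a threaded child script in V-REP (Lua).
-- 'Ghost_waistJoint' and the numeric values are placeholders, not the real scene.

simSetThreadSwitchTiming(2) -- let the thread yield roughly every 2 ms of execution

-- A blocking move: step a joint target until it is reached, yielding once per
-- simulation pass so the script stays in sync with the simulation.
local function moveJointTo(jointHandle, target, step)
    local pos = simGetJointPosition(jointHandle)
    while math.abs(pos - target) > step do
        if pos < target then pos = pos + step else pos = pos - step end
        simSetJointTargetPosition(jointHandle, pos)
        simSwitchThread() -- give control back; resume on the next simulation pass
    end
end

local waist = simGetObjectHandle('Ghost_waistJoint') -- placeholder name

-- The loop only starts again after the whole sequence has finished,
-- instead of being restarted from scratch at every simulation pass.
while simGetSimulationState() ~= sim_simulation_advancing_abortTRequested do
    moveJointTo(waist, 0.05, 0.001)
    moveJointTo(waist, -0.05, 0.001)
end
```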
I also peeked into the custom UI tool and made that little window; it's not much, but it's a start. As for what is next, I am going to put some unfortunate souls to the test with the full setup and see how well they adapt to Ghost's movements.
Thursday, 19 March 2015
Meet Ghost!
I said that I was going to create a "phantom" PHUA for the new learning platform, and so I did! Say hello to Ghost!
Ghost is a replica of PHUA with no dynamic properties, which means that he/she/it, whatever you want it to be, is immune to gravity and collisions, but it can still be measured, detect collisions without affecting the dynamics of the other bodies, and so on. Since it cannot move through reaction forces on the floor, I had to implement new IK chains that allow this new body to move around its waist, similar to its predecessor. You may also notice that it is slightly green and not opaque; I chose this setting so that it can merge with the "real" PHUA without too much effort, and I am also planning, once I learn enough of V-REP scripting, to change its colour between green/yellow/red according to the user's performance. Once I figure out how to make it move independently, I think that Ghost will prove to be a good guidance tool.
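As a note to my future self, one possible way to do that colour switch, sketched here under the assumption that Ghost is addressed as a single shape (a multi-shape model would need one call per shape) and with a made-up 0-to-1 performance score, is V-REP's simSetShapeColor:

```lua
-- Sketch: recolour Ghost from a child script according to a performance score.
-- 'Ghost' and the thresholds are placeholders; the score in [0,1] is hypothetical.
local ghostHandle = simGetObjectHandle('Ghost')

local function setGhostColour(score)
    local rgb
    if score > 0.8 then
        rgb = {0, 1, 0}   -- tracking well: green
    elseif score > 0.5 then
        rgb = {1, 1, 0}   -- acceptable: yellow
    else
        rgb = {1, 0, 0}   -- poor tracking: red
    end
    -- change the ambient/diffuse colour component of the shape
    simSetShapeColor(ghostHandle, nil, sim_colorcomponent_ambient_diffuse, rgb)
end
```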
Tuesday, 17 March 2015
Some experiments
To develop the basic maneuvers that the "phantom" PHUA will perform, I asked some of my colleagues to try to repeat, on the "real" PHUA, the same maneuvers I want to develop; this way I could measure the range of comfort that first-time users have for those maneuvers. So, after a lot of PHUA-on-the-floor moments, I was able to collect some data on the range of the motions to be developed. Here is the data I got from the experiment:
My data : frontal -> +0.040 / -0.040 || sagittal -> +0.025 / -0.025
Subject A: frontal -> +0.043 / -0.035 || sagittal -> +0.035 / -0.020
Subject B: frontal -> +0.045 / -0.040 || sagittal -> +0.038 / -0.022
Subject C: frontal -> +0.042 / -0.038 || sagittal -> +0.036 / -0.035
Subject D: frontal -> +0.052 / -0.039 || sagittal -> +0.040 / -0.040
As you can see, there is a small difference between the left and right sides in the frontal experiment, and also in the sagittal experiment. In the sagittal experiment it has to do with how quick the user was (inertia), plus how bent the knees were, which can help the user at that stage; in the frontal experiment, however, the dataset was supposed to be mirror-like. I was wondering a lot about this until a colleague of mine asked to try those same experiments, and what I got was the following:
Subject E: frontal -> +0.028 / -0.042 || sagittal -> +0.028 / -0.024
The values were the opposite of what I was expecting; it was then that I realized that this colleague is left-handed, unlike the others before.
Can this affect the control to that degree? Do we need new inputs that distinguish right-handed users from left-handed ones? These are the questions I pose after those trials; maybe future experiments will give us the answers.
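If handedness does turn out to matter, one simple option (purely a sketch of an idea, nothing implemented yet) would be to take it as an extra input and mirror the frontal command for left-handed users, while still clamping to the comfort range measured above:

```lua
-- Hypothetical handedness handling; nothing here exists in the current scripts.
local userIsLeftHanded = false          -- would come from a setting in the UI

local function adjustFrontalCommand(frontal)
    if userIsLeftHanded then
        frontal = -frontal              -- mirror the left/right axis
    end
    local maxRange = 0.040              -- comfort range observed in the trials
    if frontal > maxRange then frontal = maxRange end
    if frontal < -maxRange then frontal = -maxRange end
    return frontal
end
```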
First Goal
While working on the PHUA project, my first goal will be to develop a platform that helps future users adapt to PHUA's complex control platform. For that I will try to create a "phantom" replica of PHUA and program it to execute some basic movements, while the user controlling the "real" PHUA tries to merge the two simulations by performing those same movements. This adaptation tool will interact with other tools: a sensitivity tool where you, the user, can adjust the response of the joystick to your liking, and a performance tool that will show you how well or how badly you did in your training, leaving to the trainee the choice of retrying the activity, maybe with different sensitivity settings, in order to obtain a better score. This simulation will continue to be developed in V-REP, as its predecessor was.
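As a rough idea of how the performance tool could score a trial (only a sketch of the concept; the object names 'PHUA_waist' and 'Ghost_waist' are placeholders), one could sample the distance between the two waists at every pass and average it over the exercise:

```lua
-- Sketch of a possible performance score: average distance between the waists
-- of the real PHUA and of Ghost over a trial. Object names are placeholders.
local phuaWaist  = simGetObjectHandle('PHUA_waist')
local ghostWaist = simGetObjectHandle('Ghost_waist')
local errorSum, samples = 0, 0

local function sampleTrackingError()
    local p = simGetObjectPosition(phuaWaist, -1)   -- absolute positions
    local g = simGetObjectPosition(ghostWaist, -1)
    local d = math.sqrt((p[1]-g[1])^2 + (p[2]-g[2])^2 + (p[3]-g[3])^2)
    errorSum = errorSum + d
    samples = samples + 1
end

local function averageTrackingError()
    if samples == 0 then return 0 end
    return errorSum / samples   -- lower is better; could be shown in the custom UI
end
```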
Thursday, 26 February 2015
Intro
Hello! My name is Daniel Marques and I am currently developing a Master's Thesis in Mechanical Engineering at the University of Aveiro, in Portugal, on the topic of learning humanoid robot locomotion using haptic devices.
For those who like robotics and groundbreaking applications of haptics, please leave your own thoughts and discussions here, so that you and I can grow more knowledgeable on the subject. With time, this blog will become a record of my thoughts, achievements, failures and dead ends throughout the progress of my work, where my experience can help you, or vice versa.
So help me help you show what our robots are feeling, and with it expand the horizons of human/robot interaction.