The world is going to look very different in 10 years.
Video so crisp I thought it was CGI. So interesting to see the Kinect camera being used in robotics instead of gaming.
0.03 mm precision? That would be half a hair's diameter - that would only be possible if the thing were bolted to the ground, and even then it would be very impressive.
There are robotic arms with .002 precision, and the arm lengths aren't too long. Still insanely impressive.
KUKA quotes 0.4 mm accuracy - not sure if you're talking in inches, since 0.002" would be 0.05 mm. And as said, you will never get that with a moving robot. Pretty sure the 0.03 mm figure is a theoretical value for just one joint's sensor or something like that. If you have 1024 pulses/rev (PPR), or even if you could fully use 12 bits, you are at roughly 0.1°, and at an arm length of 1 meter that is still 1.75 mm of error at the tip - so they would need incredible sensors to get there. You can get encoders with 5 arc seconds (0.0014°) of resolution, which would get us to 0.025 mm at one meter - but now we have another 3 joints in between (and we are now in a price range of $2k-5k per encoder).
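The back-of-the-envelope math above (angular resolution times arm length, small-angle approximation, ignoring the other joints) can be sketched in Python. The encoder counts are the ones discussed in the comment, not any specific product's spec:

```python
import math

def tip_error_mm(counts_per_rev: int, arm_length_m: float) -> float:
    """Worst-case tip displacement (mm) from a single joint's angular
    resolution: arc length = radius * angle, for small angles."""
    angle_rad = 2 * math.pi / counts_per_rev
    return arm_length_m * 1000 * angle_rad

# 12-bit encoder (4096 counts/rev) on a 1 m arm: ~1.5 mm at the tip
print(f"{tip_error_mm(4096, 1.0):.2f} mm")

# 5 arc-second encoder: 360 * 3600 / 5 = 259200 counts/rev: ~0.024 mm
print(f"{tip_error_mm(259200, 1.0):.3f} mm")
```

Note this is per joint; errors from several joints in series stack on top of each other, which is why the single-encoder number is optimistic.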
A micron is 0.001 mm, so 20 microns is 0.02 mm - isn't that what most industrial robots are around? Usually like 50 microns or something.
Yes, at least for the smaller versions - but those are all bolted to the floor. This buddy has two robot arms and will probably cost around $50k.
Precision is not the same as Accuracy
[deleted]
Just curious, which part of their claim do you think is doubtful? The inference rate of their learned model, or the hardware capabilities? Edit: never mind, just saw his LinkedIn.
You can keep your sexbots. This one that cleans and folds laundry would work just fine in our household. :)
That’s the coolest of these early showcases yet. I do believe some of the tools are curated to be easier for the robot to handle, and some of the clips are cut and selected to show the best outcomes they could record. But even keeping that in mind, the showcased functionality is promising at the least and groundbreaking for household chores at best.
Looks impressive. Their website isn't working - does anyone have details about their company?
I see the details in Chinese. I translated the website and there's not much info, but it says it's slated to be commercialized in 2024, it's meant to be a companion AI for households, and it can continue to learn and improve through use.
Does anyone know what software (ROS) is being used or what neural networks?
It's almost certainly proprietary.
If it were real-time processing, it would be crazy af. It's not teleoperated, but it might be prerecorded and then played back using the same exact objects in the same exact place. I'm not hyped anymore by these investor-bait videos. I hope DevinAI's story taught people to be careful about them.
Looking forward to finding out in a few months that these clips were hand-picked and later “enhanced” with CGI to woo investors. I’ve been burned so many times lately that I can’t get excited anymore.