This is an exercise / goal for AI-controlled surgical or assembly-line robots.
Take a brick and break it up with a chisel. Spread the pieces around and have the AI scan each piece and compute where they'd go to all fit back together. The cutting-edge robot's fingers would then stack them together from the ground up, reassembling the brick into its original shape.
Being apt at evaluating the 3D environment of the area being operated on, or of the things being assembled, will be vital for the robot surgeons and workers of the future. This challenge / exercise would be one way to make sure a robot is up to the task. Showing patients its ability to do this might help engender trust in this new technology for surgery, and it would certainly be a good exercise toward precision product assembly.
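A minimal sketch of that scan-and-fit step, assuming each fragment has already been scanned into a point cloud (the file names and the distance threshold are placeholders, not part of the idea): Open3D's ICP registration refines how one fragment's surface mates against another's.

import numpy as np
import open3d as o3d

# Hypothetical scans of two brick fragments; file names are placeholders.
frag_a = o3d.io.read_point_cloud("fragment_a.ply")
frag_b = o3d.io.read_point_cloud("fragment_b.ply")

# Coarse initial guess (identity here; a real pipeline would seed this
# from feature matching before refining).
init = np.eye(4)

# Point-to-point ICP: refine the pose of fragment B relative to fragment A.
result = o3d.pipelines.registration.registration_icp(
    frag_b, frag_a,
    2.0,   # max correspondence distance, in the scan's units
    init,
    o3d.pipelines.registration.TransformationEstimationPointToPoint())

print(result.fitness)          # fraction of points that found a mate
print(result.transformation)   # 4x4 pose of B in A's frame

A fitness near 1.0 on the fracture surfaces would mean the two pieces mate; repeating this over every pair is the core of the reassembly computation.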
https://m.youtube.com/watch?v=uQ8ehsA4nS0
Your surgeon will see you now hahahaha [xenzag, Jul 23 2025]
A real robot video.
https://www.youtube...watch?v=CacWd64RJhM They're not autonomous yet, but at some point they will be where it's appropriate. [doctorremulac3, Jul 23 2025]
Trivial to solve, not applicable to surgery, and not related to precision beyond what Lego robots can already do. Sorry.
No need to apologize, but robots can't currently look at and evaluate the pieces of a smashed brick and piece them together. Lego pieces are a set size, and a two-year-old can assemble them. This would simply be an exercise to evaluate and fine-tune these multi-armed robot systems.
Being able to focus in on and evaluate the various randomly shaped pieces in a 3D environment, create a database of how they're related, and use that database to assemble, work on, or remove them would be worth perfecting.
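A toy sketch of what that database might look like (every name and number here is made up for illustration): a graph keyed by fragment pairs, each edge holding a match score and a relative pose, with a greedy pass to pick an assembly order.

import numpy as np

# Hypothetical relation database: (frag_i, frag_j) -> match quality and
# the 4x4 pose that mates fragment j onto fragment i.
match_graph = {
    (0, 1): {"score": 0.92, "pose": np.eye(4)},
    (1, 2): {"score": 0.87, "pose": np.eye(4)},
    (0, 3): {"score": 0.41, "pose": np.eye(4)},
}

# Greedy assembly order: always attach the best-matching unplaced fragment.
placed, order = {0}, [0]
while len(placed) < 4:  # toy example with four fragments
    i, j, _ = max(
        ((a, b, d["score"]) for (a, b), d in match_graph.items()
         if (a in placed) != (b in placed)),
        key=lambda t: t[2])
    nxt = j if i in placed else i
    placed.add(nxt)
    order.append(nxt)

print(order)  # [0, 1, 2, 3]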
Just an update on the whole autonomous robo-everything future: being in Silicon Valley, I have driverless cars drive by my house on a regular basis and interact with them all the time at intersections, while changing lanes, etc. They're very good drivers. The robots are already here.
The video of the robot going berserk is fake, by the way.
> robots can't currently look at and evaluate pieces of a smashed brick and piece them together.
There probably isn't a program for it, but making one is a trivial project. You just need to use a 3D puzzle-solver program. I bet even an LLM can do it. Said program shouldn't be AI; throwing in AI would only slow it down.
Here are the stubs written by Grok 4:
import numpy as np
import open3d as o3d
import json


def prep_seg(image_paths: list[str], bg_thresh: float = 0.5,
             calib: dict = None) -> list[dict]:
    """
    Preprocess images and segment frags.

    :param image_paths: List of image file paths.
    :param bg_thresh: Threshold for background seg.
    :param calib: Optional camera calib data.
    :return: List of dicts with seg data per frag.
    """
    pass


def recon_3d(seg_data: list[dict], sfm_params: str = 'default') -> list[dict]:
    """
    Reconstruct 3D models for each frag.

    :param seg_data: Output from prep_seg.
    :param sfm_params: Params for SfM.
    :return: List of dicts with 3D models per frag.
    """
    pass


def ext_feats(models_3d: list[dict]) -> list[dict]:
    """
    Extract geom features from 3D models.

    :param models_3d: Output from recon_3d.
    :return: List of dicts with features per frag.
    """
    pass


def match_pairs(features: list[dict], models_3d: list[dict]) -> dict:
    """
    Perform pairwise matching and init transforms.

    :param features: Output from ext_feats.
    :param models_3d: 3D models for alignment.
    :return: Sim graph dict.
    """
    pass


def opt_glob(sim_graph: dict, models_3d: list[dict]) -> dict:
    """
    Optimize global assembly config.

    :param sim_graph: Output from match_pairs.
    :param models_3d: 3D models.
    :return: Dict of optimized trans per frag.
    """
    pass


def comp_trans(opt_pos: dict, ref_frag_id: int) -> tuple[dict, dict]:
    """
    Compute relative transforms and validate.

    :param opt_pos: Output from opt_glob.
    :param ref_frag_id: ID of ref frag.
    :return: Tuple of (rel trans dict, val metrics dict).
    """
    pass


def gen_out(trans: dict, metrics: dict, out_path: str = 'assembly.json',
            vis: bool = False) -> str:
    """
    Generate output file and optional vis.

    :param trans: Relative trans.
    :param metrics: Val metrics.
    :param out_path: Path to save JSON.
    :param vis: Flag to generate 3D render.
    :return: Path to output file or vis.
    """
    pass
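Before filling in the stubs, a short driver like this would exercise the data passing between them (a sketch only; with the bodies still pass statements nothing meaningful runs, and the image file names are placeholders):

def assemble_brick(image_paths: list[str]) -> str:
    # Chain the seven stages; each stage consumes the previous one's output.
    seg_data = prep_seg(image_paths, bg_thresh=0.5)
    models_3d = recon_3d(seg_data)
    features = ext_feats(models_3d)
    sim_graph = match_pairs(features, models_3d)
    opt_pos = opt_glob(sim_graph, models_3d)
    trans, metrics = comp_trans(opt_pos, ref_frag_id=0)
    return gen_out(trans, metrics, out_path='assembly.json', vis=True)

# assemble_brick(['frag_01.jpg', 'frag_02.jpg'])  # placeholder inputs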
I could then go on to have Grok write the module for each stub, check the data passing, and then write the I/O for the hardware, but I'm much too lazy for that, especially since no one is going to make this thing.
So they already have this, and you just have to push a button for it to be done, but nobody would ever push the button?
Well then, I suggest they DO push the button.
As far as the Grok program goes, I was just about to say exactly that, but Grok beat me to it.
Well, that's the program for accepting raw camera data and turning it into rotation and movement vectors. You also need to coat the appropriate sides in adhesive and give directions to actually re-assemble the thing. As for me, I would like to live on the Wall-E cruise ship Axiom.
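For that last step, a tiny sketch of turning one of the program's 4x4 transforms into the rotation and movement vectors a robot controller might want (the command dict format is invented; SciPy does the matrix-to-axis-angle conversion):

import numpy as np
from scipy.spatial.transform import Rotation

def transform_to_command(T: np.ndarray) -> dict:
    # Split a 4x4 homogeneous transform into a translation vector and
    # an axis-angle rotation vector; the dict layout is hypothetical.
    translation = T[:3, 3]
    rotvec = Rotation.from_matrix(T[:3, :3]).as_rotvec()  # axis * angle, radians
    return {"move": translation.tolist(), "rotate": rotvec.tolist()}

# Quick check: a 90-degree turn about z plus a 10 mm shift in x.
T = np.eye(4)
T[:3, :3] = Rotation.from_euler('z', 90, degrees=True).as_matrix()
T[0, 3] = 10.0
print(transform_to_command(T))

The adhesive and the gripping sequence are a whole other module, of course.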
But it would be nice to step outside onto various planets and fight really badly costumed lizard aliens from time to time.
I'll settle for a miniature version of one of those Chinese robots running amok, lowered into a bucket of paint to act as a mixer.
Well, there you go. I'd bun that.