
SegmentAnyBone in MITK
Open, Normal, Public

Description

SegmentAnyBone is a foundation-model-based bone segmentation algorithm for MRI scans, adapted from the Segment Anything Model (SAM). It can segment bones in the following 17 body parts:
Humerus, Thoracic Spine, Lumbar Spine, Forearm, Pelvis, Hand, Lower Leg, Shoulder, Chest, Arm, Elbow, Hip, Wrist, Thigh, Knee, Foot, Ankle

Paper:
https://arxiv.org/abs/2401.12974
Code:
https://github.com/mazurowski-lab/SegmentAnyBone

An overall feasibility test should be done to check whether integrating it into MITK is worthwhile, both in terms of user interest and implementation effort.


Event Timeline

a178n triaged this task as Normal priority. Jan 26 2024, 11:47 AM
a178n created this task.

SegmentAnyBone is SAM with an extra attention pathway, pretrained on an MRI dataset.
They trained the vit_t (ViT-Tiny) variant of the vision transformer and released new pretrained weights. From a code perspective, they modified the SAM code to include the attention map and bundle it with the repo.
Salient points:

  • Automatic segmentation only.
  • There is no out-of-the-box inference code. We need a script bridging MITK and SegmentAnyBone, just like the current SAM tool.
  • Even though the SAM-only code supports 2D images, SegmentAnyBone can take 3D information into account in the attention map generation workflow.
  • CC BY-NC 4.0 license.
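The bridging script mentioned above could follow the same pattern as the current SAM tool: a long-running Python process that waits for MITK to hand it work, runs inference, and writes the mask back. The sketch below is a minimal file-based version of that idea; the trigger-file protocol, function names, and the placeholder `segment_volume` are assumptions for illustration, not the actual MITK or SegmentAnyBone API.

```python
# Hypothetical sketch of a file-based MITK <-> SegmentAnyBone bridge.
# The real tool would replace segment_volume with SegmentAnyBone inference
# (load the vit_t weights, run auto-segmentation, return the mask bytes).
import json
import pathlib
import time


def segment_volume(in_file: pathlib.Path) -> bytes:
    # Placeholder for the real inference call; here it just tags the input.
    return b"MASK:" + in_file.read_bytes()


def serve_once(trigger: pathlib.Path) -> pathlib.Path:
    """Wait for MITK to drop a trigger JSON ({"input": ..., "output": ...}),
    run segmentation, write the mask file, and delete the trigger to
    signal completion back to MITK."""
    while not trigger.exists():
        time.sleep(0.05)
    request = json.loads(trigger.read_text())
    out_path = pathlib.Path(request["output"])
    out_path.write_bytes(segment_volume(pathlib.Path(request["input"])))
    trigger.unlink()  # completion signal
    return out_path
```

A real daemon would wrap `serve_once` in a loop and add error reporting, but the handshake shape (request file in, mask file out) is the part that matters for the feasibility test.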

Pain point:
The pip installation of the repo was not as straightforward as claimed. The requirements file pins outdated versions of some packages; the code appears to have been developed in a virtual environment created a few years ago, on Python 3.7 or so.
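One possible workaround for the stale pins is to relax the exact `==` pins to minimum-version `>=` constraints so pip can resolve versions compatible with a current Python, then install from the relaxed file inside a fresh virtual environment. The snippet below is a sketch; the two pinned packages written to `requirements.txt` are illustrative stand-ins, not the repo's actual pins.

```shell
# Stand-in for the repo's outdated requirements file (illustrative pins).
printf 'torch==1.7.1\nmonai==0.9.0\n' > requirements.txt

# Relax exact pins ("==") to minimum versions (">=") so pip can resolve
# packages that still exist for a modern Python.
sed 's/==/>=/' requirements.txt > requirements-relaxed.txt
cat requirements-relaxed.txt

# Then, inside a fresh virtual env:
#   python3 -m venv sab-env && . sab-env/bin/activate
#   pip install -r requirements-relaxed.txt
```

Whether the relaxed versions are actually API-compatible with the SegmentAnyBone code would need to be verified as part of the feasibility test.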