In T30359#257187, @floca wrote: If I haven't missed something, we can directly replace MitkCLResampleImageToReference with MitkMapImage. I can patch the corresponding Kaapana files as soon as we have the new MITK installer. Do we have tests for the containers?
Mar 7 2024
Mar 6 2024
Feb 27 2024
Dec 11 2023
gaoh added a comment to T30293: Introduce new interoperability/developer-friendly scene file format (including reader).
Thank you! Yes, this would already be really helpful. Allowing us to create only one file and, if needed, just change a few properties in it makes it much more usable.
Aug 2 2023
So I can store files as NRRD and then use MitkFileConverter to convert them to DICOM? And this is better than directly storing the files as DICOM?
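If so, the conversion might look something like the following. This is only a sketch: the `-i`/`-o` options are an assumption based on the usual MITK command-line module conventions, so check `MitkFileConverter --help` for the actual interface.

```shell
# Assumed invocation (verify with --help): convert an NRRD image to DICOM
MitkFileConverter -i image.nrrd -o image.dcm
```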
Aug 1 2023
OK, since this issue made it impossible to use v2023.04 in the Kaapana mitk-flow, I am still on the previous release, where it more or less works.
So I would stay on that previous release until there is a new MITK version addressing these DICOM issues.
For now we depend heavily on DICOM, so we don't have the option to just store the files in another file format...
Jul 4 2023
Jun 27 2023
gaoh updated subscribers of T29650: mitk-flow "Please reinitialize the selected segmentation image!" problem.
@k656s has an example dataset. I am still on the release before that (I think 2022.10?), but I will update to the current release. Could that already help?
If not, I would vote for the option to reinitialize the task list view after loading.
Jun 22 2023
Mar 27 2023
Theoretically yes, but I guess it is not relevant: we now use the task list feature, and before that I used the described workaround. So this is not needed anymore.
Jan 25 2023
Thanks again for solving this issue! The task list feature is now integrated in the current Kaapana develop branch!
Dec 12 2022
gaoh triaged T29431: [worklist] mitk-flow: save changed dicom segmentation not possible as Normal priority.
Dec 6 2022
gaoh triaged T29428: Creating scene file with image properties can result in missing LevelWindowbar as Normal priority.
Nov 25 2022
So the last changes in the branch are basically the changes I made so that she could modify the code already used for the DICOMweb/Kaapana interaction and simply apply it to this module, since it would have been quite similar. I guess that when the changes to the REST module are done, we (the AIH cluster) will take care of restarting the MITK-Kaapana interaction. I think this could then be a not-so-complicated add-on.
Nov 25 2022, 12:16 PM · Moved to git.dkfz.de, MITK (v2024.12), Restricted Project, Request for Discussion
Oh yes, I also forgot about this. The final state was, if I remember correctly, that she did not succeed in producing meaningful results.
Nov 25 2022, 12:12 PM · Moved to git.dkfz.de, MITK (v2024.12), Restricted Project, Request for Discussion
Sep 21 2022
Oh, so I forgot to send it, but this is what my browser had cached: "Yes, so I recently tested it with the current release."
But if I remember correctly, this was due to the phantom dataset we have as the default dataset:
I removed a few slices at the end, and now it is working.
Jul 22 2022
OK, but then the GPU problem is also solvable! If it is for a pre-known, time-limited usage, like an interactive session, I don't see a problem with assigning a GPU to the container.
Regarding point 1: in Kubernetes you can mount any path/volume to the desired mount point, so I could just mount the directory into the MITK container at e.g. /models.
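As a minimal sketch of that idea, a pod spec could mount a host directory into the MITK container at /models. All names, the image, and the host path here are illustrative assumptions, not actual Kaapana configuration:

```yaml
# Hypothetical pod fragment: mount a directory with models from the node
# into the MITK container at /models (names and paths are placeholders).
apiVersion: v1
kind: Pod
metadata:
  name: mitk-flow
spec:
  containers:
    - name: mitk
      image: mitk-flow:latest        # placeholder image name
      volumeMounts:
        - name: models
          mountPath: /models
  volumes:
    - name: models
      hostPath:
        path: /data/models           # directory on the node
        type: Directory
```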
Jun 1 2022
May 17 2022
Yes, that will help. Also, this worklist will be quite helpful, since it allows a new form of "batch processing".
May 16 2022
gaoh lowered the priority of T28884: MITK Flowbench -Save Seg not working, due to file naming issues from Normal to Low.
Yes, I changed it already and am using the SeriesInstanceUID as the name for the images. Additionally, I had to add the layer property: by introducing the image name property, the layer is somehow also set (to a high number, probably). When I create a new SEG without setting the layer of the image to 0, the SEG is only put on top once the Data Manager is opened.
Mar 25 2022
Jan 27 2022
In T28886#232894, @kalali wrote:Please check if the problem persists now with T28903: [Segmentation] Monitoring of segmentation nodes in views is error prone/ not safe. merged into develop.
Jan 26 2022
Input data to reproduce the error:
We discussed in the meeting:
- Exception handling is a good idea and should be the way to handle this, so MITK has to throw an exception instead of reporting success.
- Since a required tag is missing in the input image, MITK cannot provide a fix that would allow dcmqi to succeed.
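The validate-then-raise pattern discussed above can be illustrated schematically. This is not actual MITK or dcmqi code; the tag list and function name are illustrative:

```python
# Schematic sketch: before reporting a successful SEG export, verify that
# the DICOM tags required for the export are present, and raise an
# exception instead of silently reporting success.
REQUIRED_TAGS = ["StudyInstanceUID", "SeriesInstanceUID", "SOPClassUID"]

def check_required_tags(header: dict) -> None:
    """Raise ValueError listing any required tags missing from the header."""
    missing = [tag for tag in REQUIRED_TAGS if tag not in header]
    if missing:
        raise ValueError(f"Cannot export SEG, missing DICOM tags: {missing}")

# Example: a header lacking SOPClassUID should fail loudly, not "succeed".
header = {"StudyInstanceUID": "1.2.3", "SeriesInstanceUID": "1.2.3.4"}
try:
    check_required_tags(header)
except ValueError as err:
    print(err)
```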
gaoh updated the task description for T28946: MITK Flow - Save Segmentation - Error/Exception Handling.
Jan 25 2022
gaoh closed Restricted Maniphest Task, a subtask of T27968: CTP gets stuck on large import, as Resolved.
Dec 1 2021
gaoh triaged T28889: [Installer] Ubuntu 20.04 missing required package to start GUI as Normal priority.
Nov 30 2021
Yes, sorry I didn't highlight it ;):
gaoh added a project to T28886: MITK Flowbench - Reopen created segmentation not working: Kaapana (internal).
Nov 3 2021
Oct 29 2021
In the current setup, the transfer is handled by Airflow; this is also the case in the wDB gateway, so this stress test should behave the same way and work. I also have a different test with random data that works up to a limit. The system now has several recovery mechanisms and can therefore handle large (randomly sorted) datasets better. I guess there are still occasional errors, but since the system then restarts, no one notices them. There are still limits, though, depending mainly on the server (RAM).
So I would also say this ticket is resolved for now. In the long run, changing the whole import process could remain a valid option.
Sep 10 2021
gaoh added a comment to T28600: Kaapana containerd version is 1.2.5 which is 2 years old: Need to update that.
So, this is what we have tried so far:
- Helm updated to 3.6
- MicroK8s to 1.22
- First problem: the v1beta1 API version cannot be used anymore --> changed to v1 in the repo
- Next problem: our Traefik is not compatible anymore (with 1.22)
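For context, the v1beta1 removal in Kubernetes 1.22 affects Ingress resources in particular: they have to move from `networking.k8s.io/v1beta1` to `networking.k8s.io/v1`, which also changes a few fields. The resource and service names below are illustrative:

```yaml
# Old (removed in Kubernetes 1.22): apiVersion: networking.k8s.io/v1beta1
apiVersion: networking.k8s.io/v1
kind: Ingress
metadata:
  name: example-ingress              # illustrative name
spec:
  rules:
    - http:
        paths:
          - path: /
            pathType: Prefix         # required field in v1
            backend:
              service:               # v1 nests the backend under "service"
                name: example-service
                port:
                  number: 80
```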
gaoh lowered the priority of T28529: Replace Elasticsearch and Kibana with apache-licensed opensearch from Normal to Low.
Aug 3 2021
Jul 19 2021
So I tried it out. The problem is not the containers (opensearch and opensearch-dashboards) but our plugin (workflow-trigger).
The workflow-trigger plugin has some package dependencies based on elastic/kibana. These dependencies have to be changed to the ones of opensearch-dashboards.
I tried to change them, but I get some yarn build issues.
Since I am having various issues, it would probably be easier to restart the plugin from scratch. But for this I would like to know how the current plugin was created, to get an understanding of how to create a similar one.
The current issue is that the package is searching for specific files; an opensearch-dashboards plugin probably has to have a specific directory layout:
May 25 2021
May 19 2021
gaoh reopened T28380: [MultLabel Segmentation, MITK FlowBench] Segmentation not recognized as segmentation in Release v2021.02 as "Open".
The same issue seems to apply to the normal Workbench, so this container also has to be updated to the current master.
May 5 2021
Apr 8 2021
Mar 22 2021
After testing the system, the problem might be in the Airflow part; this has to be tested.
Mar 16 2021
gaoh updated the task description for T28380: [MultLabel Segmentation, MITK FlowBench] Segmentation not recognized as segmentation in Release v2021.02.
Mar 15 2021
T28207 describes how to test it without the wDB. When searching for "DICOM query retrieve", there are already several open tasks with similar problems.
Mar 4 2021
gaoh moved T27968: CTP gets stuck on large import from Discussion to In Progress on the Kaapana (internal) board.
Feb 24 2021
I cannot reproduce the error. When sending the data to an instance, the data gets imported. When debugging CTP locally, there also seems to be no problem. To me it also looks like the problem is not directly in CTP but in the Java library (dcm4che). Did the files get triggered/sent to Airflow? Did they get stuck in a quarantine folder of CTP, and if so, in which one?
In T27968#220733, @schererj wrote:This is one of my model DICOMs (not 100% sure if this will reproduce it).
If not I can try to make bigger parts.
Just a remark: the download link does not work for me; I get a forbidden-access error, even when I am logged into the platform. But since there is only one folder in MinIO, I found it without the link. :)
Can you send me a dataset to reproduce it? I could try to debug CTP, but it looks like the problem is not even in CTP but in org.dcm4che.
Feb 23 2021
Get an Openstack instance (e.g. Ubuntu 20.04 DKFZ image,...)
Feb 22 2021
Can you reproduce the error? If so, we can do something like: https://stackoverflow.com/questions/12096403/java-shutting-down-on-out-of-memory-error
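The approach behind that link can also be approximated with JVM flags, assuming CTP is started via a plain `java` invocation (an assumption; the jar name is a placeholder):

```shell
# Make the JVM exit instead of continuing in a broken state after an
# OutOfMemoryError, and dump the heap for later analysis (flags available
# since JDK 8u92). "CTP.jar" is a placeholder for the actual launcher.
java -XX:+ExitOnOutOfMemoryError -XX:+HeapDumpOnOutOfMemoryError -jar CTP.jar
```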
Feb 19 2021
Feb 18 2021
In T28302#220475, @kislinsk wrote:I noticed that the only plugins enabled in the "MitkVolume" configuration are the segmentation plugins. Do you explicitly want to have this "slim" version or is the official WorkbenchRelease configuration more sufficient?
Feb 17 2021
In T28302#220185, @floca wrote:And MITK volume (aka the normal workbench) uses the Release build with segmentation=ON
Maybe we should then just label it "MITK Workbenches" to avoid confusion and reduce (redundant) artifacts.
In the case of Kaapana, the name is currently irrelevant: directly in the next operator, the DICOM files are sent to the PACS anyway, and then the file is deleted.
What would make sense, regarding the naming in the PACS and when downloading files, would be to give it the name of the Instance UID.
Feb 16 2021
So maybe a regex is needed, or just a default file name...
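A minimal sketch of that regex-or-default idea: sanitize a DICOM-derived name and fall back to the SeriesInstanceUID when nothing usable remains. Function name and rules are illustrative, not MITK behavior:

```python
import re

def safe_file_name(name: str, series_uid: str) -> str:
    """Replace characters unsafe for file names; fall back to the UID."""
    cleaned = re.sub(r"[^A-Za-z0-9._-]+", "_", name).strip("_")
    return cleaned if cleaned else series_uid

print(safe_file_name("CT Thorax 1/2", "1.2.840.113"))  # → CT_Thorax_1_2
print(safe_file_name("???", "1.2.840.113"))            # → 1.2.840.113 (fallback)
```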
Feb 12 2021
Ah so the idea is not to build it, but to directly use a CI binary.
So MITK flow uses:
```dockerfile
# Generate Ninja build script for MITK to build a minimum configuration with apps in Release mode into the MITK-superbuild directory
RUN cmake -G Ninja -S MITK -B MITK-superbuild
RUN cmake -S MITK -B MITK-superbuild -D CMAKE_BUILD_TYPE:STRING=Release -D BUILD_TESTING:BOOL=OFF -D MITK_BUILD_CONFIGURATION:STRING=FlowBenchSegmentationRelease -D MITK_CUSTOM_REVISION_DESC:STRING=MitkFlow
RUN cmake --build MITK-superbuild
RUN cmake --build MITK-superbuild/MITK-build --target package
RUN mkdir /opt/final_package
RUN cp /opt/MITK-superbuild/MITK-build/MITK-MitkFlow-linux-x86_64.tar.gz /opt/final_package/MITK-MitkFlow-linux-x86_64.tar.gz
```
And MITK volume (aka the normal workbench) uses the Release build with segmentation=ON
For MITK-flow and MITK-volume I will switch to the new release (release/T28000-v2021.02). It was only on a develop branch because there were some bug fixes not yet in a master branch.
But now with the new release...
Feb 1 2021
gaoh added a comment to T28239: Multilabel Segmentation - 3D Segmentation Confused about required outcome.
To be honest, I just did not understand the checklist correctly. If you know the outcome and read the checklists, they are fine. But for me, it was like a new test starting with loading and testing 3D data again. So I misunderstood the line "Bei den folgenden Aktionen stets mehrere Label, auch auf unterschiedlichen Layern testen." ("For the following actions, always test several labels, also on different layers.") as being a new test, because it had a kind of title character. And that is why I thought I would start with a 3D image again...
Jan 29 2021
The other options, "Show as volume rendering" and "Enable auto-selection mode", are not working as described. Or I didn't get the description...