DICOM Standards Committee Presentation Overview

DICOM Meetings, Tokyo, September 2-4, 2024

Explore the evolution from D3 to D4 in addressing new data types and use cases while maintaining interoperability. Learn about potential D4 driving motivators and challenges in efficiently managing very large data objects. Discover the nuances of transferring and processing multidimensional TB data objects in the medical imaging domain.

  • DICOM
  • Standards Committee
  • Data Types
  • Interoperability
  • Very Large Data




Presentation Transcript


  1. DICOM Meetings, Tokyo, September 2-4, 2024. "D4 Drivers / Done in D3." Presentation to the DICOM Standards Committee, September 2-3, 2024, by DICOM WG-10 (Strategic).

  2. D4 vs. D3+
     • D3 has grown steadily for decades via extensions (and will continue to). It addresses new data types and new use cases, but is constrained to align with existing data models and protocol conventions; those constraints limit some use cases and some solutions.
     • D4 can make changes warranted by corresponding benefits: more ambitious interoperability boundaries (see next slides), and new features, capabilities, and performance within those boundaries.
     • Still avoid unnecessary disruption; maintain significant interoperability with D3 systems (see Stewardship).

  3. DSC Instructions (March) to WG-10
     • Explore how the Motivators can be addressed as D3 extensions.
     • The Driving Motivators for D4 are valid needs, but addressing them may not require D4.
     • Avoid significant disruption.

  4. (Potential) D4 Driving Motivators
     • Distributed Storage
     • Viewing Performance
     • Very Large Objects
     • Developer Efficiency / Appeal
     • Security
     • Device Management
     • Storage/Computing Compatibility
     • Very Large Data Collections
     • Dataflow Performance
     (See the Background Motivator slides for details. Order is based on asking WG-10/DSC members to pick their top two needs.)

  5. Motivator: Very Large Data Objects
     Efficiently access and manage instances with high byte count and dimensionality.
     • Large data objects continue to emerge (pathology, surgical video, full fMRI with a volume scan per second for hours, new scan types, etc.). Larger numbers of sample bits, channels, pixels, frames, and dimensions result in n-dimensional TB-scale data objects, large series sets, etc.
     • Challenges arise from PS3 limits (32-bit indices, VRs) and from data handling. PS3 has tried enhanced objects, dimension organization, frame-based access, new VRs (floating point, etc.), and new SOP classes.
     Needs:
     • Transfer performance for the entire object (in some use cases).
     • Efficient processing (e.g., of fMRI): statistical analysis of changes from volume to volume to produce a new dataset (floating point); when you rewindow, you reprocess.
     • Q: What does NIFTI have that DICOM doesn't? Slimness (see the Profiling Motivator). NIFTI is just a 3D array of pixels, like enhanced objects. (Could possibly have NIFTI-style access that uses a file+offset pointer.)
     • Partial transfer (beyond frame-level retrieve)? Which part do you need first? Transfer priority flags or structure; what is a processable subset to start with? Synergy with the Profiling Motivator: negotiate a subset for transfer.
     • Anatomically oriented subset selection? "Give me the pituitary slices" (clinical understanding in the object); create a segmentation as the index. Could be done by the application or by the modality. Organ-based query/retrieve.
     • Richness of instance metadata vs. richness of the total dataset: just the pixels, or just the Type 1 tags? Recognize products that provide Type 2 and many Type 3 attributes.
     • How to navigate the dimensions and select subsets.
     Mechanisms (solutions):
     • Parallelization (beyond frame-level retrieve)?
     • Downsampling (for hardware limits).
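Frame-level retrieve already exists in DICOMweb (PS3.18 WADO-RS). As a minimal sketch of the "partial transfer" idea, a client can request only selected frames of a very large multi-frame object rather than the whole instance. The base URL and UIDs below are placeholders, and `wado_rs_frames_url` is an illustrative helper, not a library function.

```python
def wado_rs_frames_url(base_url, study_uid, series_uid, instance_uid, frame_numbers):
    """Build a WADO-RS URL that retrieves only the listed frames of a
    multi-frame instance (the PS3.18 /frames/{framelist} resource)."""
    frames = ",".join(str(f) for f in frame_numbers)
    return (f"{base_url}/studies/{study_uid}/series/{series_uid}"
            f"/instances/{instance_uid}/frames/{frames}")

url = wado_rs_frames_url("https://pacs.example.com/dicomweb",
                         "1.2.3", "1.2.3.4", "1.2.3.4.5", [1, 2, 100])
# -> .../instances/1.2.3.4.5/frames/1,2,100
```

The open questions on the slide (priority flags, anatomically oriented selection) go beyond this: a frame list is a syntactic subset, not a clinically meaningful one.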

  6. Motivator: Viewing Performance
     Efficiently organize and present the clinical dataset.
     • Efficient pixel retrieval supports a good user experience and interaction responsiveness (synergy with Large Objects and Profiling). Could better client-server convergence reduce server-side transcoding costs?
     • Storage servers don't recognize or optimize themselves as a step in the display pipeline. Prefetching the same study to 10 workstations is not a good solution.
     • Clinician time-to-first-image averages on the order of 1000 ms, and each clinician has on the order of 300 such interactions per day; 100 ms would be effectively instantaneous, saving each radiologist roughly 5 minutes per day.
     Needs:
     • Efficient metadata retrieval so the viewer can plan and organize its views and its pixel retrieval strategy.
     • Ability for the viewing client to pull efficiently from the storage server (with no intermediate pre-processor).
     • Metadata can become large, and the desired subset varies with application and study type. QIDO solutions depend on designing and optimizing the client and server in tandem (semi-proprietary).
     Specific examples:
     • Leverage deduplication/normalization of metadata and high parseability for performance. Legacy conversion is hypothetically possible, but requires very advanced understanding.
     • Time to populate the series list (is this a DICOM issue?).
     • Having to retrieve all the instances to get all the metadata (not just series-level data, but also segmentation info, measurements, etc.).
     Mechanisms:
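The metadata-first planning step described above maps onto QIDO-RS (PS3.18), where `includefield` asks the server to return extra attributes beyond the defaults so a viewer can plan its layout without pulling full instance headers. A hedged sketch: `qido_instances_query` is a hypothetical helper, and the attribute keywords are examples only.

```python
from urllib.parse import urlencode

def qido_instances_query(base_url, study_uid, extra_fields):
    """Build a QIDO-RS query for the instances of one study, requesting
    the default result attributes plus each keyword in extra_fields."""
    params = [("includefield", f) for f in extra_fields]
    return f"{base_url}/studies/{study_uid}/instances?" + urlencode(params)

url = qido_instances_query("https://pacs.example.com/dicomweb",
                           "1.2.3", ["NumberOfFrames", "Rows"])
```

Whether the returned subset is sufficient still depends on client and server agreeing on which fields matter, which is exactly the "tandem design" problem the slide flags.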

  7. Motivator: Distributed Storage Management
     Manage multiple copies of instances across federated/distributed nodes.
     • (Imaging) data is increasingly moved, copied, shared, and cached, which can become inconsistent. Copies are made across devices, departments, sites, intra-enterprise, inter-enterprise, and between countries, to support study workflow/dataflow, priors, and patient-provided data. [All ranges are in scope, but this is not about ad hoc sharing.]
     • Benefit: supports and optimizes local architectures for data archiving, data sharing, data processing, and data access/delivery.
     • PS3 originated with a local-creation/local-use model (and a conservative industry). It doesn't address synchronization, currency, prime copy, de-duplication, deprecation, collision resolution, versions, or divergent IDs and code sets. Study consistency (which instances are in it) vs. instance copy consistency. What mechanisms could support these inherently, rather than as import/exception processes?
     Needs:
     • Identify and resolve inconsistencies between distributed copies (notification vs. recatalog polling).
     • Address inefficiencies of redundant transfers.
     • Cloud implementations (an increasingly common architecture) raise some of these questions (multi-tenancy) in inefficient ways.
     • "Dumbness" of implementations: rather than using alternate patient IDs, they stuff data somewhere ad hoc.
     • [Figure out what regional data consistency policies might look like, and what DICOM needs to do to support them.]
     Mechanisms:
     • Reliable change logs (like Original Attributes)? A record of what was in the dataset at a point in time (e.g., the original acquisition).
     • Are there ways to define better, more diligent rules for UID updates and semantics? (Existing rules have a grey zone; conservative behaviors often create new UIDs unnecessarily.) [UID collisions exist; UIDs are best-effort, so their use as keys is not perfectly reliable.] [DICOM-conformant behavior could require UID collision detection at certain points.]
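One way to frame "identify and resolve inconsistencies of distributed copies" is catalog diffing keyed by SOP Instance UID. A simplified sketch, assuming each node can digest its stored encodings; `catalog` and `diff_copies` are illustrative names, and a real system would also have to handle legitimate versions, coercions, and the UID-collision caveat noted on the slide.

```python
import hashlib

def catalog(instances):
    """Summarize a node's holdings as {SOPInstanceUID: content digest}.
    `instances` is a list of (uid, payload_bytes) pairs; a real system
    would hash the encoded dataset, this is only a sketch."""
    return {uid: hashlib.sha256(data).hexdigest() for uid, data in instances}

def diff_copies(local, remote):
    """Classify inconsistencies between two nodes' catalogs: instances
    held by only one node, and instances whose contents diverged."""
    return {
        "missing_remote": sorted(local.keys() - remote.keys()),
        "missing_local": sorted(remote.keys() - local.keys()),
        "diverged": sorted(u for u in local.keys() & remote.keys()
                           if local[u] != remote[u]),
    }
```

This is the "recatalog polling" pattern; a notification-based mechanism would push change events instead of comparing full catalogs.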

  8. Motivator: Development Costs / Efficiency
     Reduce training and tooling costs/barriers for imaging development.
     • PS3 does not match current programming and tooling styles: arcane terminology and an 8000-page high wall make for a steep learning curve for all new developers. (Imaging-domain education is still required regardless, and some complexity is inherent.)
     • The approachability and user experience of the standard for its users (i.e., developers) is poor. [Whom do we target? New creator SCUs/user agents? New client SCUs/user agents? Servers are more stable, mainstream, and well staffed.]
     • The PS3 spec is hard to read and navigate (finding the 10 pages you need, especially for certain tasks). Custom libraries and tooling are often needed. (Future software styles will continue to evolve, but with the current ones as the basis?)
     • Leverage the existing expertise and experience of most developers and tools: RESTful HTTP (DICOMweb), JSON representations, open-source reference implementations. [Are there machine-readable documentation conventions PS3.18 could follow to let general RESTful tools ingest the spec?] DICOM/DICOMweb plugins for VS Code and the like? Generative coding tools?
     • Reverse the keyword-vs-tags choice, or support both? Benefits: less complex code, better/cheaper validation, automated tool generation, potential community building.
     Needs:
     • Developers can't find the existing solutions; they are not well documented (arcane knowledge).
     Mechanisms:
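The keyword-vs-tags tension is visible in the PS3.18 Annex F DICOM JSON model, which keys each attribute by its 8-digit tag even though developers usually think in keywords. A minimal sketch of bridging the two, with a two-entry table standing in for the full data dictionary (real tooling would generate it from PS3.6).

```python
# Tiny keyword -> (tag, VR) table; an illustrative subset only.
DATA_DICT = {
    "PatientID": ("00100020", "LO"),
    "Modality":  ("00080060", "CS"),
}

def to_dicom_json(attrs):
    """Encode keyword-named attributes in the PS3.18 Annex F JSON model,
    where each attribute is keyed by tag and carries its VR and Value."""
    out = {}
    for keyword, value in attrs.items():
        tag, vr = DATA_DICT[keyword]
        out[tag] = {"vr": vr, "Value": [value]}
    return out

to_dicom_json({"PatientID": "PID123"})
# -> {"00100020": {"vr": "LO", "Value": ["PID123"]}}
```

Supporting both spellings is mostly a tooling problem, which is one reason the slide frames it as a developer-efficiency issue rather than a protocol change.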

  9. Motivator: Security
     Meet current baseline security best practices.
     • Sites commit to investing in security that improves healthcare continuity, patient safety, and privacy. Images aren't inherently different from patient information in general when it comes to security.
     • Better prevent ransomware and data breaches; monitor and control data access. Make image data incorruptible/immutable/write-once? Secure by (product) design. Make vendors commit to ongoing maintenance of security features on all products? (An RFP matter vs. an informatics standard?)
     • De-identified data access as part of the protocol, though challenging to implement (free text). Partitioning the relevant metadata could segregate PHI elements, making security better/cheaper when composing Part 3 responses? Nominally de-identified views available.
     • Role/user-based access control with fine-grained filters (a server design issue?). (Operator authentication friction means no operator is known.)
     • Also address secure cross-enterprise transfer steps (see Distributed Storage and change/consistency tracking). (Ask Peter about a DoD contact for sharing patterns/challenges.)
       Push: consent, select data, find destination, connect/authenticate, secure transfer, validate/reconcile.
       Pull: consent, find source, connect/authenticate, find data, select data, secure transfer, validate/reconcile.
     • User impact: good tools from vendors with which to address security.
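The "nominally de-identified view" idea can be sketched as replacing a configured set of PHI elements before a response is composed. The keyword set below is a tiny illustrative subset of the PS3.15 Annex E Basic Confidentiality Profile, not a complete or compliant de-identifier; as the slide notes, free-text leakage is the genuinely hard part that no element list solves.

```python
# Illustrative subset only; PS3.15 Annex E lists hundreds of elements.
PHI_KEYWORDS = {"PatientName", "PatientID", "PatientBirthDate"}

def deidentified_view(dataset):
    """Return a copy of a metadata dict with PHI elements replaced,
    leaving the stored original untouched."""
    return {k: ("REMOVED" if k in PHI_KEYWORDS else v)
            for k, v in dataset.items()}
```

Partitioning metadata so PHI elements live in a separable block, as the slide suggests, would let a server serve such views without rewriting every element at request time.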

  10. Motivator: Device Management
      Efficiently manage large numbers of imaging systems at hospitals.
      • AE-Title-based communication means IP or server changes disrupt hundreds of clients. Manual configuration can be error prone and means significant labor and delays to recover.
      • Installation and troubleshooting take time; idiosyncratic configuration procedures on each different product make it worse.
      • DHCP? C-ECHO to self-configure?
      • User impact: staff costs, care throughput/disruption, system robustness. Vendor impact: service costs.

  11. Motivator: Storage/Computing Compatibility
      Support interoperable use of storage/compute services.
      • Imaging is storage- and compute-intensive. Storage/compute services offer potential cost, performance, and reliability benefits, but such interfaces are outside the scope of the Standard, so sites are limited to services specifically integrated by the vendor.
      • Cloud storage, very large workloads, parallel processing.
      • Avoid cached copies; access a common copy via authorized read-only pointers for direct S3-like access. Could a clearinghouse with such pointer access mediate/facilitate cross-enterprise sharing?
      • Does imagining a completely cloud-based infrastructure suggest new needs?
      • User impact: resource costs, staff costs, system robustness. Vendor impact: interoperability opportunities.

  12. Motivator: Very Large Collections of Data
      Aggregate and manage collections of very many (heterogeneous) instances.
      • Large site archives, research collections, AI training sets, registries, public health/outcomes.
      • PS3 has scaled to multi-billion-instance collections, but it is challenging. An index is needed to find instances, and sites currently design their own; the Query Model also has limits. It is hard to find the cohort you want in the VNA(s).
      • D4 mechanisms could help aggregate and manage such collections. Patient view (longitudinal, narrow) vs. cohort view (other time concepts, very broad). [In DICOM's scope?] [Also business extract and analysis.] [Talk to large practices, etc., and get actual problems.]
      Mechanisms:
      • Subscription/filtered update streams; parallel transfers/processing; bulk transfers.
      • Deduplication/normalization of metadata: summarize, reduce redundancy, lower transaction cost.
      • Databases with ALL the metadata that don't mirror it in instance headers? Export the metadata (like Sup 223).
      • Might be another application/client that ingests and indexes (Image Manager, Image Archive, Image Indexer).
      • TODO: pick Nick C's brain.
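The deduplication/normalization mechanism above can be sketched as factoring out attributes that repeat identically across every instance into one shared block, leaving small per-instance remainders. A toy sketch over plain dicts; `normalize` is an illustrative name, and a real implementation would work at the series or study level over encoded datasets.

```python
def normalize(instances):
    """Split per-instance attribute dicts into one shared block (keys
    present in all instances with identical values) plus per-instance
    remainders, reducing redundancy in very large collections."""
    common_keys = set.intersection(*(set(i) for i in instances))
    shared = {k: instances[0][k] for k in common_keys
              if all(i[k] == instances[0][k] for i in instances)}
    remainders = [{k: v for k, v in i.items() if k not in shared}
                  for i in instances]
    return shared, remainders
```

This is essentially what the slide's "databases with ALL metadata" idea amortizes: store the shared block once, query it cheaply, and reconstitute full headers only on export.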

  13. Motivator: Dataflow Performance
      Efficient flow of data during imaging workflow.
      • Study workflow often involves sending the same data to multiple locations/consumers (see also Distributed Storage Management). Could D4 make that faster and/or decrease network traffic?
      • Broadcast/parallel transmission: does this relieve a bottleneck?
      • Lean on HTTP caching proxies (DICOM routers).

  14. Motivator: Data Profiling
      Facilitate apps getting the specific data they need/expect.
      • Reliably getting just what you need significantly simplifies app development. ("DICOM has too much junk that I'm not looking for; use NIFTI." "Remove clutter; simplify. If it's there, I have to figure out whether it matters or not. I might need it, just not right now.")
      • PS3 SOP Classes provide an effective contract between devices for data transfer, but since SOP Classes are a universal contract, few attributes are mandatory. Many applications need more specific contracts and more predictable inputs.
      • Some form of profiling to agree on further-constrained attributes and code sets could facilitate integrations and improve interoperability; a mechanism is needed to express the requirements. [Encoding inconsistencies: Lawrence's examples from research archives; also usage-specific differences ("eye" is fine, unless you're an ophthalmologist).]
      • Conformance as a form of profiling: content and workflow profiles. Some requirements are cross-modality (frame of reference, grayscale); others are specific (HU handling).
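A "mechanism to express the requirements" could be as simple as validating presence and code-set membership against an application profile. A sketch under assumed inputs: `check_profile` and its parameters are hypothetical, and real profiles would also constrain VRs, units, multiplicities, and conditional requirements.

```python
def check_profile(dataset, required, allowed_codes):
    """Report profile violations: `required` lists attributes that must
    be present; `allowed_codes` maps an attribute to the code values
    the consuming application is prepared to handle."""
    problems = [f"missing {k}" for k in required if k not in dataset]
    problems += [f"{k}={dataset[k]!r} not in {sorted(codes)}"
                 for k, codes in allowed_codes.items()
                 if k in dataset and dataset[k] not in codes]
    return problems
```

Run at ingest or before transfer, such a check turns the vague "might need it, just not right now" problem into an explicit, negotiable contract between producer and consumer.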
