
Hall B Magnet FastDAQ Support Group Overview
Presentation Transcript
Hall B Magnet FastDAQ
Brian Eng, Detector Support Group
August 27, 2018
Overview
- Issues with the current configuration
- What the existing deployed code looks like and how it performs
- Current development code and its performance
- What worked and what didn't
- Conclusion
Timestamp Jitter
- Data is read from cRIO ADC modules and written to 2000-element EPICS arrays (which automatically add a timestamp) every 200 ms
- The 2000-element array size is due to a bug in the NI EPICS Server that prevents setting an array size larger than the default
- A time delta is calculated between sample timestamps:
  - < 100 ms is considered a duplicate
  - > 300 ms is considered a miss
  - Ideally the delta should be exactly 200 ms
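The delta classification above can be sketched in Python; the thresholds and the 200 ms nominal spacing come from the slide, while the function and variable names are invented for illustration and are not part of the deployed LabVIEW code.

```python
# Classify deltas between consecutive EPICS array timestamps (in seconds).
# Thresholds from the slide: < 100 ms => duplicate, > 300 ms => miss;
# the nominal spacing is 200 ms.

def classify_deltas(timestamps):
    """Count 'duplicate', 'ok', and 'miss' intervals in a timestamp list."""
    counts = {"duplicate": 0, "ok": 0, "miss": 0}
    for earlier, later in zip(timestamps, timestamps[1:]):
        delta = later - earlier
        if delta < 0.100:
            counts["duplicate"] += 1
        elif delta > 0.300:
            counts["miss"] += 1
        else:
            counts["ok"] += 1
    return counts

# Example: one nominal interval (0.2 s), one duplicate (0.05 s), one miss (0.45 s)
print(classify_deltas([0.0, 0.2, 0.25, 0.7]))  # -> {'duplicate': 1, 'ok': 1, 'miss': 1}
```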
Deployed Code
- Single timed while loop with a sequence structure
- Each sequence frame runs at 100 ms
- All calculations (min, avg, max) are done in the loop
- Not all ADC channels are used
  - Array manipulation to remove unused channels
- Writing to EPICS and the PLC is done in the loop
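As a rough illustration of the single-loop design, the per-iteration work (trimming unused channels, then computing min/avg/max) can be sketched in Python. The channel indices and names are hypothetical examples; the actual code is a LabVIEW block diagram, not Python.

```python
# One iteration of the monolithic deployed loop: remove unused channels,
# then compute (min, avg, max) per remaining channel.
USED_CHANNELS = [0, 1, 3]  # hypothetical: channel 2 is unused

def process_iteration(samples):
    """samples: list of per-channel sample lists from one ADC read."""
    used = [samples[ch] for ch in USED_CHANNELS]  # the array-manipulation step
    stats = [(min(ch), sum(ch) / len(ch), max(ch)) for ch in used]
    # In the deployed code, the EPICS and PLC writes also happen here,
    # inside the same loop iteration.
    return stats

print(process_iteration([[1.0, 3.0], [2.0, 2.0], [9.0, 9.0], [0.0, 4.0]]))
# -> [(1.0, 2.0, 3.0), (2.0, 2.0, 2.0), (0.0, 2.0, 4.0)]
```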
Deployed Code (Block Diagram)
[Block diagram: ADC read, array manipulations, and EPICS & PLC writes all inside one loop]
Deployed Code (Jitter)
Development Code (as of Today)
- Separate loops for reading the ADC and for all other functionality
- An RT FIFO passes data between the loops
- Updated cRIO Waveform Reference Library
  - Can return the data as an interleaved 1D array (no array manipulations needed)
- Array manipulations and calculations are done in a separate loop
- Writing to EPICS and writing to the PLC are separate VI calls
- Uses 32-bit floating point instead of 64-bit
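The producer/consumer split above can be sketched with a bounded queue standing in for the RT FIFO. In the real system these are LabVIEW RT FIFO VIs on a cRIO, so everything below (the queue, thread functions, and names) is purely illustrative of the design, not the actual implementation.

```python
import queue
import threading

# A bounded queue stands in for the LabVIEW RT FIFO: the ADC-read loop
# only acquires and enqueues; a second loop dequeues and performs the
# calculations and EPICS/PLC writes, so slow downstream work cannot
# stall the acquisition loop.
fifo = queue.Queue(maxsize=16)

def adc_read_loop(n_iterations):
    """Producer: acquire one block per iteration and push it to the FIFO."""
    for i in range(n_iterations):
        sample_block = [float(i)] * 4  # placeholder for one interleaved ADC read
        fifo.put(sample_block)
    fifo.put(None)  # sentinel: acquisition finished

def processing_loop(results):
    """Consumer: pop blocks and do the downstream work."""
    while True:
        block = fifo.get()
        if block is None:
            break
        # Calculations and the EPICS/PLC writes would happen here.
        results.append(min(block))

results = []
producer = threading.Thread(target=adc_read_loop, args=(5,))
consumer = threading.Thread(target=processing_loop, args=(results,))
producer.start(); consumer.start()
producer.join(); consumer.join()
print(results)  # one entry per acquired block
```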
Development Code (Block Diagram)
[Block diagram: ADC read loop feeding an RT FIFO; separate loop performing the EPICS and PLC writes]
Development Code (Jitter)
The Good, the Bad, and the Ugly
- Moving to 32-bit floating point was a big improvement
  - Since the data from the ADC modules is 32-bit, there is no loss of precision
- Separating out functionality made it easier to measure the timing of each individual function
- The DAQmx API is much simpler and faster to deploy than FPGA code, but has worse timing (~13 ms vs. ~5 ms RMS)
- Adding timeouts to some functions (e.g., RT FIFO Read) completely killed any jitter gains, producing two peaks at 200 ms and 400 ms
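One plausible way to see how a read timeout produces two peaks: if the FIFO is empty when the read starts, a blocking read with a 200 ms timeout expires once and the data is only picked up on the next attempt, doubling the observed interval. The sketch below models that mechanism in Python; the 200 ms figure is from the slide, but the model itself is an assumption about the cause, not taken from the LabVIEW code.

```python
# Model of an RT FIFO read with a timeout equal to the 200 ms period:
# if the block is already in the FIFO, the read completes within one
# nominal period; if not, the read times out once and the block is
# retrieved on the next attempt, so the observed interval is ~400 ms.
PERIOD = 0.200  # nominal sample spacing and read timeout (from the slide)

def observed_intervals(ready_flags):
    """ready_flags[i] is True if block i was in the FIFO at read time."""
    return [PERIOD if ready else 2 * PERIOD for ready in ready_flags]

# Mostly on-time reads with one empty-FIFO timeout: intervals cluster
# at 200 ms with an extra peak at 400 ms.
print(observed_intervals([True, True, False, True]))  # -> [0.2, 0.2, 0.4, 0.2]
```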
Conclusions
- The new code has significantly less jitter, but at the expense of occasionally missing a sample entirely
  - Misses are still much less frequent than with the current code
  - Most files have zero timestamp issues
- The current code has timing issues in nearly every file generated (2 GB ROOT files, ~30 min each)
- Still a work in progress
- A slower ADC read rate would eliminate all timestamp issues, independent of the code
BACKUPS
DAQmx (left) vs FPGA (right)
RT FIFO Read with Timeout