Please Help! Considering address alignment, what is the worst case time required to fetch a 24-byte data object from memory with a 60 ns cycle time and a data bus that is 64 bits wide?
I googled it and found the answer to be: worst-case time = 4 cycles × 60 ns = 240 ns. I don't have much knowledge of this (and certainly not of how to find the worst-case time), but the best/aligned case should be ((24 × 8 bits) / 64 bits) × 60 ns = 3 cycles × 60 ns = 180 ns. As I understand it, the worst case comes from alignment: if the 24-byte object doesn't start on an 8-byte boundary, it straddles four 64-bit bus words instead of three, so one extra cycle is needed.
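To sanity-check that reasoning, here's a small sketch (my own helper, not from any textbook) that counts bus cycles under the assumption that the bus transfers one aligned 64-bit word per cycle, so an access spanning N aligned words costs N cycles:

```python
import math

def fetch_cycles(size_bytes, bus_bytes, offset):
    # Hypothetical helper: number of bus cycles to fetch `size_bytes`
    # starting `offset` bytes into an aligned bus word, assuming one
    # aligned bus-word transfer per cycle.
    return math.ceil((offset + size_bytes) / bus_bytes)

BUS_BYTES = 8    # 64-bit bus
CYCLE_NS = 60

best = fetch_cycles(24, BUS_BYTES, 0)                           # aligned start
worst = max(fetch_cycles(24, BUS_BYTES, o) for o in range(BUS_BYTES))

print(best * CYCLE_NS)    # 180 (3 cycles, aligned)
print(worst * CYCLE_NS)   # 240 (4 cycles, misaligned)
```

With any nonzero offset the 24 bytes spill into a fourth bus word, which is where the 4 × 60 ns = 240 ns worst case comes from.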
Thank you once again! We (the class) are on Chapters 5 and 6 of Fundamentals of Embedded Software with the Arm Cortex-M3. I searched the PDF for "worst case time" (found in Chapter 8) and saw how a worst-case time is calculated there: "Estimating the worst-case latency from time of interrupt request to the moment that data is read from the device is relatively straightforward. We simply add one cycle to complete or abandon the current instruction, the time required for stacking and the rest of the exception response, and the execution time of the first two instructions in the ISR required to actually read the data from the device." That passage is about interrupt latency rather than memory fetches, but by analogy I'm assuming one more cycle (60 ns) must be added to get the worst case.
I'm still not sure about the method. Do tell me when you get an approved solution (it's not in my curriculum, but learning a bit won't hurt).