At a glance:
- Creating a Byte Analysis ✅ - complete, but the executable still needs to be created
- Verify analysis program with David
- Helping Andrew parse DMA
- Run analysis on previous files
- Run modified M0/M4 program and collect files for new analysis
- GUI - create plan
- Aidan suggested looking into: (msp430fr, STM32L4). I need to go over the exact use case with David.
- inter_sample = 100 us
- inter_average = 500 us --> testing removal of this parameter
- samples = 20
Time analysis on these parameters, assuming little to no overhead, where each averaged data point takes roughly samples $\times$ inter_sample + inter_average.
E.g. 1. With inter_average: 20 $\times$ 100 us + 500 us = 2.5 ms per point. This is about 400 Hz.
E.g. 2. Without inter_average: 20 $\times$ 100 us = 2.0 ms per point. This is about 500 Hz.
As we increase the number of samples averaged, the sample frequency decreases, as expected. We effectively have fewer samples per second, but the samples are "steady": the averaging should work to reduce noise.
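The timing model above can be sketched in Python. This is a rough estimate that ignores overhead (ADC read time, SD writes); the function name is mine, but the parameter values are the ones listed in these notes:

```python
def effective_rate_hz(inter_sample_us, inter_average_us, num_samples):
    """Rough time model: each averaged point costs num_samples reads
    spaced by inter_sample_us, plus one inter_average_us pause.
    Overhead (ADC conversion, writes) is ignored."""
    point_duration_us = num_samples * inter_sample_us + inter_average_us
    return 1e6 / point_duration_us

# Current parameters: inter_sample = 100 us, inter_average = 500 us, samples = 20
print(effective_rate_hz(100, 500, 20))  # 400.0 averaged points per second
# Dropping inter_average raises the rate:
print(effective_rate_hz(100, 0, 20))    # 500.0
```

This also shows why removing inter_average is attractive: it is pure dead time added to every averaged point.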
- What is the purpose of having half a ms pause between averages?
  Originally we had it due to comments on stability, but part of me feels the overhead plus the high-level functions we are using should be a sufficient buffer to ensure stability of the unit.
  ✔️ D.S.: We shouldn't need the inter_average parameter.
- Do we have a desired sample frequency? If so, what is it, and does it factor in this averaging?
  ✔️ D.S.: The desired sample frequency depends on the board, and yes, these values take the averaging into account.

Time Constants:
a. Fast board: 1 ms
b. Medium board: 10 ms
c. Slow board: 150 ms
E.g.: for the medium board with a time constant of 10 ms, we will aim for about 3 ms between each averaged data point, giving us about 3 points per time constant. Thus, the sample frequency is $\frac{1}{3\text{ ms}} = \frac{10^3}{3}\text{ Hz} \approx 333\text{ Hz}$ (roughly 300 Hz).
⚠️ Follow-up to question 2 ⚠️
Are we sure that 3 points is enough? I might be misremembering how the time constant applies to the decay plot.
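The spacing arithmetic in that example can be double-checked with a couple of one-line helpers (my names, using the medium-board values from the notes: 10 ms time constant, ~3 ms spacing):

```python
def sample_rate_hz(spacing_ms):
    # Frequency is the reciprocal of the spacing between averaged points
    return 1e3 / spacing_ms

def points_per_time_constant(time_constant_ms, spacing_ms):
    # How many averaged points land inside one decay time constant
    return time_constant_ms / spacing_ms

print(round(sample_rate_hz(3)))                    # 333 (the "about 300 Hz" figure)
print(round(points_per_time_constant(10, 3), 1))   # 3.3 points per time constant
```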
Testing serial_log against the following baud rates:
- 115,200
- 230,400
- 250,000
- 460,800
- 500,000
packages > serial_log
output files loc: analysis > tests > serial_output > data > M0_baud
Each baud directory contains subdirectories labeled by the parameters used. For example: M0_baud > 115200 > 50_50_4, where 50_50_4 represents interSample_interAverage_numSamples. More info can be found in the README.md.
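Since the subdirectory names encode the run parameters, a small helper can recover them when iterating over output directories. This is a hypothetical convenience function (not part of the repo), following the naming convention described above:

```python
def parse_run_dir(name):
    """Split a directory name like '50_50_4' into its parameters
    (interSample, interAverage, numSamples), per the naming convention."""
    inter_sample, inter_average, num_samples = (int(p) for p in name.split("_"))
    return {"interSample": inter_sample,
            "interAverage": inter_average,
            "numSamples": num_samples}

print(parse_run_dir("50_50_4"))
# {'interSample': 50, 'interAverage': 50, 'numSamples': 4}
```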
Requires: New analysis tests on output files
M0 Binary storage program was successfully created.
Issue: .bin files caused issues when transferring from SD to computer. After reformatting, the issue disappeared. It reappeared when I started storing .txt files, and again the unit needed to be reformatted.
Conclusion: the binary data is now stored in .dat files, and if I need to alternate between .dat and .txt, a reformat needs to occur.
- timeBefore (4 bytes)
- timeAfter (4 bytes)
Changed the storage of the sums to 4 bytes:
- sum_sensorValue_A0 (2 bytes) --> (4 bytes)
- sum_sensorValue_A1 (2 bytes) --> (4 bytes)
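With the layout above, each record is 16 bytes (two 4-byte timestamps plus two 4-byte sums), so the .dat files can be unpacked with Python's struct module. This is a sketch, not the project's actual reader: it assumes little-endian byte order (the SAMD boards are little-endian) and the field order listed above.

```python
import struct

# timeBefore, timeAfter, sum_sensorValue_A0, sum_sensorValue_A1 (uint32 each)
RECORD = struct.Struct("<IIII")

def read_records(path):
    """Yield (timeBefore, timeAfter, sum_A0, sum_A1) tuples from a .dat file."""
    with open(path, "rb") as f:
        while chunk := f.read(RECORD.size):
            if len(chunk) < RECORD.size:
                break  # ignore a trailing partial record
            yield RECORD.unpack(chunk)
```

Each yielded tuple then feeds the timing analysis directly (e.g. tAfter - tBefore per record).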
DMA: Direct Memory Access.
This should work independently of the CPU and is dedicated to data transfer.
- Diagram in process; loc:
docs > drawings > Data_Transfer.png
- Currently helping with DMA example programs; no POC is currently available
- Do we let the CPU perform the summing and store to RAM, and then allow the DMA to access RAM and transfer to the SD card?
  This might remove the gaps, but it doesn't help with getting the files off of the unit.
- Do we instead have DMA send the SD files to the computer and let the CPU control the writing?
  This would not remove the gap issue.
- Will the interrupts cause a larger gap than the ones we are experiencing?
- Binary analysis prototype complete - executable required
- Remove inter_average delays on all M0 & M4 programs
- Functions in analysis require final approval
Code snippet to reference.
// Declare local variable/Buffer
uint32_t sum_sensorValue_A0 = 0;
uint32_t sum_sensorValue_A1 = 0;
// Collect time before sampling
uint32_t timeBefore = micros();
// Build buffer: read sensor value then sum it to the previous sensor value
for (unsigned int counter = 1; counter <= numSamples; counter++){
sum_sensorValue_A0 += analogRead(A0);
sum_sensorValue_A1 += analogRead(A1);
// Pause for stability
myDelay_us(intersampleDelay);
}
// Collect time after sampling
uint32_t timeAfter = micros();
// Pause for stability
myDelay_us(interaverageDelay);
// Write each value's raw bytes to the file (4 bytes per field);
// passing a pointer and length writes all bytes, not just the low byte
file.write((const uint8_t*)&timeBefore, sizeof(timeBefore));
file.write((const uint8_t*)&timeAfter, sizeof(timeAfter));
file.write((const uint8_t*)&sum_sensorValue_A0, sizeof(sum_sensorValue_A0));
file.write((const uint8_t*)&sum_sensorValue_A1, sizeof(sum_sensorValue_A1));
- d0: A0 analog value
- d1: A1 analog value
- tBefore: taken before summing analog values
- tAfter: taken after summing, but before writing & the inter_average delay
The median sampling time should be the time between points, assuming no gaps. See the code snippet found at the start of Calculations Used in Analysis; $n$ matches the number of times the inter_sample delay is called.
# Actual sampling time per point (array); t1 = tBefore, t2 = tAfter
t_sampling = t2 - t1
# The median is the actual time spent sampling
actual_sampling_time = numpy.median(t_sampling)
We expected $t_s$ to be a stable value -- it is! This assumes no overhead and:
- doesn't account for the inter_average delay
- doesn't account for write times
- doesn't account for the if/else statement check
# Actual File duration (us)
actual_file_duration = t[-1] - t[0]
Starting with $t$ found in the Time per Averaged Data Point calculations.
Dead time vs Expectation (us)
- actual file duration = $F_a$
- expected file duration = $F_e$
- number of points = $N$
- actual time spent sampling = $\Delta t_a$
- expected time spent sampling = $\Delta t_e$
Starting with $t$ found in the Time per Averaged Data Point calculations.
Code Snippet from D.S's original program.
# difference between adjacent points
dt = t - np.roll(t, 1)
dependent_variable = ...  # placeholder: this should depend on the expected spacing
# index the gaps by a threshold
## small gaps
gap_index_S = np.where(dt > dependent_variable)[0]
## sum all of the gaps above the smaller threshold
sum_small_gaps = np.sum(dt[gap_index_S])
## calculate the dead time as a percentage of the file duration
dead_time = (sum_small_gaps / tot_len_file) * 100
- What is an appropriate threshold for the smaller gaps? The median?
  This is used to calculate dead time; currently it's set to the smallest gap found, and all gaps larger than it are summed.
  ✔️ D.S.: The threshold should be calculated using the Time Spent Sampling, aka Equation (2).
Use the following for new calculations:
- Expected duration of file = CPD $\times$ #_of_points
- Dead time vs. Expectation = (actual duration - expected duration) / expected duration
  - note: 0.1 = 10% longer than expected; 1 = took twice as long as expected
- New median dt = median(tAfter - tBefore)
- New DeadTime due to Gaps = (actual_duration - #_of_points $\times$ median_dt) / (#_of_points $\times$ median_dt)
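The new calculations above, sketched in numpy; function and variable names mirror the notes (cpd is the time per averaged data point, all times in us), but these are my own wrappers, not the analysis code itself:

```python
import numpy as np

def expected_file_duration(cpd_us, n_points):
    # Expected duration of file = CPD x number of points
    return cpd_us * n_points

def dead_time_vs_expectation(actual_us, expected_us):
    # 0.1 -> 10% longer than expected; 1 -> took twice as long
    return (actual_us - expected_us) / expected_us

def dead_time_due_to_gaps(actual_us, t_before, t_after):
    # New median dt = median(tAfter - tBefore), per record
    median_dt = np.median(np.asarray(t_after) - np.asarray(t_before))
    expected_us = len(t_before) * median_dt
    return (actual_us - expected_us) / expected_us
```

For example, a file that took 1100 us against an expected 1000 us gives a dead time vs. expectation of 0.1, i.e. 10% longer than expected.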
Three options:
- regular merge
  - takes all commits from the dev branch and adds a merge commit on top
- squash and merge
  - takes all dev branch commits, squashes them into one commit, and puts it on main
  - drawback: removes per-commit information and moves it all into one bloated commit
- rebase merge
  - takes all dev branch commits and places them on main as-is
  - benefit: looks identical to working on main; no noise about branches
No pushing to main
- Reformatting the entire repo structure
- Adjusting rules for now, but eventually moving to an organization to add admin roles
- Currently we will work on PR for verification and use releases
Step 1: Identify the work you need to move to a new branch.
In the following git log example, we see a commit ahead of origin that we need to move to a new branch. Unfortunately, we already committed to main, so we need to reset it.
git log
< This commit is ahead of origin >
commit 7a198af75335a58e5fa6439ae6cbab4f37e65674 (HEAD -> main)
Author: Drixitel <mpmunoz1993@gmail.com>
Date: Wed Jan 10 22:13:09 2024 -0800
step 1
commit d77798ca76f53b24112bfc7b007ed30205289149 (origin/main, origin/HEAD)
Author: Drixitel <mpmunoz1993@gmail.com>
Date: Wed Jan 10 21:54:53 2024 -0800
repo info
commit fd65ff2d943db06a68b180b3a7bf1ec0542e1731
Author: Drixitel <mpmunoz1993@gmail.com>
Date: Wed Jan 10 17:44:42 2024 -0800
report finalized
Step 2: Uncommit
In order to undo the commit we use:
git reset HEAD~1
This command resets our local main branch to the specified target (in this case HEAD~1).
- HEAD: the current commit we are on; in our example, commit 7a198af75335a58e5fa6439ae6cbab4f37e65674
- ~1: one commit before; in our example, d77798ca76f53b24112bfc7b007ed30205289149
- In general, <commit id>~X means X commits behind <commit id>.
The reset command undoes the commits but leaves the work done intact.
$ git reset HEAD~1
Unstaged changes after reset:
M step1.txt
$ git status
On branch main
Your branch is up to date with 'origin/main'.
Changes not staged for commit:
(use "git add <file>..." to update what will be committed)
(use "git restore <file>..." to discard changes in working directory)
modified: step1.txt
no changes added to commit (use "git add" and/or "git commit -a")
Step 3: Move changes to a new branch.
The following command will move all of your unstaged changes to a new branch, myNewBranch.
git switch -c myNewBranch
For example
$ git switch -c myNewBranch
Switched to a new branch 'myNewBranch'
$ git status
On branch myNewBranch
Changes not staged for commit:
(use "git add <file>..." to update what will be committed)
(use "git restore <file>..." to discard changes in working directory)
modified: step1.txt
no changes added to commit (use "git add" and/or "git commit -a")
Step 4: Now get the changes to GitHub.
Commit your changes
$ git add step1.txt
$ git commit -m "here we go again"
And now we PUSH to GitHub. GitHub does not know about our new local branch, so we need to tell it the first time we push.
$ git push -u origin HEAD
- this pushes our current branch (the branch HEAD is currently on) to origin, which is GitHub in this case
- now GitHub has a branch called myNewBranch (the remote branch), and we still have our local branch, also called myNewBranch

About the command git push -u origin HEAD:
- Note: when running git pull, we pull from the "upstream branch" (the branch we are interested in pulling changes from; in this case, GitHub's remote branch)
- -u: sets the upstream branch to GitHub's copy of myNewBranch
Now we can git pull and git push as normal, just as if we were on main. You can move between this branch and main by using git checkout <branch name>. To view your branches, git branch lists the local branches only. Use git fetch origin <remote branch name> to grab non-local branches.
Removing Local
$ git branch -D <local branch>
Removing Remote
$ git push origin -d <remote branch>
$ git checkout -b <new_branch_name>
This command needs to run once - the first time you push to upstream.
$ git push -u origin HEAD
Without fetching, you run the risk of creating a detached HEAD state. Do this instead:
$ git branch -r # to see remote
$ git fetch origin <branch_name>
$ git checkout <branch_name>