Open Source Instruments Forum
Message boards for technical support and customer questions.
Feed: https://opensourceinstruments.com:443/Forums/feed.php?f=3&t=30

Re: reading NDF files in python
https://opensourceinstruments.com:443/Forums/viewtopic.php?t=30&p=127#p127

I apologize for missing some of your previous posts; I hadn't realized there was a second page for the discussion... It turns out most of my questions had already been answered!

> 1) We are missing 1-second intervals as the Neurorecorder synchronizes with the computer, but the file names and lengths are consistent. Do you agree that this is the correct behavior?

I do agree that this is the correct behavior.

> 2) Please try the Exporter in Neuroplayer 171 and let me know if it eliminates overlap and correctly names the export files.

Thank you for addressing the issue we discussed in this update. I will provide feedback as soon as I have tested this new version.

> 3) Do you really mean 10.6.10?

I hadn't seen the post mentioning LWDAQ 10.6.11, so we installed the previous version. We will migrate to the new version as soon as possible.

> 4) I tested 10.6.11 with hour-long archives and 8-second intervals. It did not produce extra files for me. If you are using 10.6.11 and still see this error, please let me know. I may need more samples of your hour-long files. Does this problem occur with every archive, or only some?

I will test version 10.6.11 and check if the error persists.

Thank you very much for your help. We are grateful.

Best wishes,
Raphaël

Statistics: Posted by Raphaël Nunes — Thu Aug 01, 2024 9:37 am


Re: reading NDF files in python
https://opensourceinstruments.com:443/Forums/viewtopic.php?t=30&p=126#p126

I am sorry to hear that you are still having problems with the exporter.

> We have installed the latest version of LWDAQ (10.6.10) on our new recording machine.

The latest version is LWDAQ 10.6.11. Do you really mean 10.6.10? You can download 10.6.11 from here:



or here:



> 2) The 8-second overlap still exists when exporting files with an 8-second processing interval.

I tested 10.6.11 with hour-long archives and 8-second intervals. For me it did not produce these extra files. If you are using 10.6.11 and you still see this error, then I will test again. I may need some more samples of your hour-long files. Does this problem occur with every archive, or only with some archives?

> 3) When exporting in EDF format with the latest version of Neuroplayer, the exported files use "cnt" as units instead of mVs (millivolts).

That's because you lost your Exporter settings when you updated your LWDAQ. You need to set up the conversion from "cnt" (sixteen-bit analog-to-digital converter counts) to millivolts using the EDF Setup button. See the EDF section here:



You can read the EDF header from one of your older files with the Read button in the EDF Setup Panel.
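
If you prefer to check the header from Python, here is a minimal sketch using the third-party pyedflib package (not part of LWDAQ, and only one of several EDF readers) to print each signal's units and physical range; the file name is just an example:

import pyedflib  # third-party package: pip install pyedflib

f = pyedflib.EdfReader("E1702291203.edf")  # example file name from this thread
for ch, label in enumerate(f.getSignalLabels()):
    print(label,
          f.getPhysicalDimension(ch),   # units, for example "mV" or "cnt"
          f.getPhysicalMinimum(ch),
          f.getPhysicalMaximum(ch))
f.close()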

> Additionally, we are not familiar with the "cnt" unit and cannot convert it to mVs.

The A3047A1B transmitters you are using have three voltage inputs with dynamic ranges of 60 mV, 120 mV, and 120 mV. These ranges are asymmetric about zero volts: for the 60-mV input, the range is -36 mV to +24 mV; for the 120-mV inputs, it is -72 mV to +48 mV. In the EDF setup, you set the units to "mV" and you set "min" and "max" to the two range limits. The temperature sensor entry allows you to convert the digitized temperature sensor output into centigrade, like this:



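For the voltage channels, the count-to-millivolt conversion is just a linear scaling across the input range. Here is a minimal Python sketch, assuming the full sixteen-bit scale of 0 to 65535 and using the 60-mV limits above; substitute the limits of the input you are converting:

def cnt_to_mv(cnt, v_min=-36.0, v_max=24.0, full_scale=65535):
    # Map a sixteen-bit count onto the input range v_min..v_max, in mV.
    return v_min + (cnt / full_scale) * (v_max - v_min)

print(cnt_to_mv(0))      # -36.0 mV, bottom of the 60-mV range
print(cnt_to_mv(65535))  # +24.0 mV, top of the 60-mV range
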
Best Wishes, Kevan

Statistics: Posted by Kevan Hashemi — Mon Jul 29, 2024 12:33 pm


Re: reading NDF files in python
https://opensourceinstruments.com:443/Forums/viewtopic.php?t=30&p=125#p125

Thank you again for your help.

We have installed the latest version of LWDAQ (10.6.10) on our new recording machine. After using this new machine to record and export the signals, here are some observations:

1) Although the Neuroplayer no longer crashes, it now creates two files when exporting: one containing the full 3600 seconds and another with only the last 8 seconds.

2) The 8-second overlap still exists when exporting files with an 8-second processing interval.

3) When exporting in EDF format with the latest version of Neuroplayer, the exported files use "cnt" as units instead of mVs (millivolts). This is problematic for us because we built our processing scripts based on the previously used mVs units. Additionally, we are not familiar with the "cnt" unit and cannot convert it to mVs. Is there a way to modify the export unit in version 10.6.10 of Neuroplayer? This would be very useful to us.

Best wishes,
Raphaël

Statistics: Posted by Raphaël Nunes — Mon Jul 29, 2024 11:42 am


Re: reading NDF files in python
https://opensourceinstruments.com:443/Forums/viewtopic.php?t=30&p=123#p123

Thank you for your patience. We just released LWDAQ 10.6.11 with Neuroplayer 171 and Neurorecorder 168. You can pull from our Git repository:



Or download and unzip our new archive:



The default behavior of the Neurorecorder is now "free-running" rather than "re-synchronizing". The Synchronization box will be unchecked. We discuss the two synchronization strategies here:



Please try the Exporter in the Neuroplayer 171 and let me know if it eliminates overlap and assigns the correct names to the export files.

Best Wishes, Kevan

Statistics: Posted by Kevan Hashemi — Mon Jul 22, 2024 1:30 pm


Re: reading NDF files in python
https://opensourceinstruments.com:443/Forums/viewtopic.php?t=30&p=117#p117

Returning to this behavior:

> Once the conversion to EDF is complete:
> “M1702291203.ndf” has been converted to “E1702291203.edf”
> “M1702294804.ndf” has been converted to “E1702294795.edf”

I am able to reproduce this behavior easily. I am recording 60-s files with synchronization at the start of each new file. I am exporting 60-s files, starting with the first file. I see the following conversions:

M1721250458.ndf -> E1721250458.edf
M1721250519.ndf -> E1721250516.edf
M1721250580.ndf -> E1721250575.edf
M1721250641.ndf -> E1721250634.edf

The problem arises when the file length and the export length are the same. There is a logical error in my code for this particular situation. I believe the correct behavior should be:

M1721250458.ndf -> E1721250458.edf
M1721250519.ndf -> E1721250519.edf
M1721250580.ndf -> E1721250580.edf
M1721250641.ndf -> E1721250641.edf

Here we are missing 1-s intervals as the Neurorecorder synchronizes with the computer, but the file names are the same and all the files are the same length. Do you agree that this is the correct behavior?

Best Wishes, Kevan

Statistics: Posted by Kevan Hashemi — Wed Jul 17, 2024 5:17 pm


Re: reading NDF files in python
https://opensourceinstruments.com:443/Forums/viewtopic.php?t=30&p=116#p116

I have just exported all four of the files you sent me: no crash, freeze, or stop, and no error message. I'm using LWDAQ 10.6.10, Neuroplayer 170. The metadata in your NDF files says, "Creator: Neurorecorder 162, LWDAQ_10.5.2." The LWDAQ GitLog has the following entry just before the release of LWDAQ 10.5.4:

commit a188b7552d0c1ed4f3fd01708d6eb1b623f918f6
Author: Kevan Hashemi <hashemi@opensourceinstruments.com>
Date:   Mon Feb 27 17:32:23 2023 -0500

    Fixed Neuroexporter freeze bug in Windows.

A "freeze" bug is different from a "crash" bug. But we did a lot of work on the Neuroexporter between 10.5.3 and 10.5.4. I suggest you try upgrading your LWDAQ. If you are using our GitHub repository (link below), just do "git pull" to get the latest pre-release version.



If you install using our multi-platform ZIP archive, use this link to get LWDAQ 10.6.10:



Please try the new version and see if it can export without stopping.

Best Wishes, Kevan

Statistics: Posted by Kevan Hashemi — Tue Jul 16, 2024 4:23 pm


Re: reading NDF files in python
https://opensourceinstruments.com:443/Forums/viewtopic.php?t=30&p=115#p115

Thank you for your detailed explanation of the problems with overlap between export files.

> Once the conversion to EDF is complete:
> “M1702291203.ndf” has been converted to “E1702291203.edf”
> “M1702294804.ndf” has been converted to “E1702294795.edf”

This does not look right to me. That's not what I wanted the Exporter to do. There may be something unusual about your NDF file that is causing the Exporter to behave badly. I will start by fixing the bug in the Exporter that causes the Exporter to crash the Neuroplayer. After that, I will look at why the Exporter is creating these overlaps.

Right now, I think that both of the problems you are having with the Exporter are due to problems in our Exporter code, which is good news, in a way, because we can fix problems in our code. In the meantime, thank you for your patience, and thank you for answering all my questions.

Best Wishes, Kevan

Statistics: Posted by Kevan Hashemi — Tue Jul 16, 2024 1:08 pm


Re: reading NDF files in python
https://opensourceinstruments.com:443/Forums/viewtopic.php?t=30&p=114#p114

> What I mean by “crash” is that the Neuroplayer shuts down without returning any error message.
> It basically quits and disappears, as you say.

Okay, thank you for clarifying. That's a real crash, and that should never happen. It's a bug in my code.

> When the Neuroplayer crashes, the Neurorecorder doesn’t seem to be affected and keeps recording.

Good. They are separate processes, so I expect the Neurorecorder to keep running.

> We have attempted exporting the same folder multiple times, and it is always the same files that cause the Neuroplayer to crash.

I am glad to hear that: I will be able to reproduce the problem and fix the bug.

> I will send you these files by mail.

I have them, and will look at them today.

> Visualizing the files responsible for the crashes with the Neuroplayer (using the “Play” button) poses no issues.
> It is only their export that causes the software to crash.

That is interesting. Thank you. You will hear from me soon.

Best Wishes, Kevan

Statistics: Posted by Kevan Hashemi — Tue Jul 16, 2024 1:03 pm


Re: reading NDF files in python
https://opensourceinstruments.com:443/Forums/viewtopic.php?t=30&p=113#p113

> Could you please confirm to me that the only hardware requirement for the machine is a
> supplementary network port?

Correct.

> Can we use a Linux OS instead of Windows?

Yes, you can use Linux, MacOS, or Windows. To run on Linux, you will probably start it with "./lwdaq" from the terminal; see here:



Best Wishes, Kevan

Statistics: Posted by Kevan Hashemi — Tue Jul 16, 2024 1:01 pm


Re: reading NDF files in python
https://opensourceinstruments.com:443/Forums/viewtopic.php?t=30&p=112#p112

A quick detail to add to Raphaël's post: when the Neuroplayer crashes during export, it disappears with no error message.

Also, even though it is reassuring to know that a machine running at 98% CPU should not prevent your software from working correctly, I would like to upgrade the acquisition machine to a more powerful model.

Could you please confirm to me that the only hardware requirement for the machine is a supplementary network port? Can we use a Linux OS instead of Windows?

Many thanks again for your precious help!

Marco

Statistics: Posted by MNPompili — Tue Jul 16, 2024 11:40 am


Re: reading NDF files in python
https://opensourceinstruments.com:443/Forums/viewtopic.php?t=30&p=111#p111

Thank you for your rapid response.

Allow me to clarify a few elements.

"When you say "crash," does the Neuroplayer quit and disappear? Or does it freeze so that it will not respond to button presses? Or does it stop with an error message in its text window? If there is an error message, what is the message?"

What I mean by “crash” is that the Neuroplayer shuts down without returning any error message. It basically quits and disappears, as you say.

Here is some information that might be useful:

1) When the Neuroplayer crashes, the Neurorecorder doesn’t seem to be affected and keeps recording.

2) We have attempted exporting the same folder multiple times, and it is always the same files that cause the Neuroplayer to crash. I will send you these files by mail.

3) Visualizing the files responsible for the crashes with the Neuroplayer (using the “Play” button) poses no issues. It is only their export that causes the software to crash.

I hope this information is helpful to you.

Best wishes,
Raphaël Nunes da Silva

Statistics: Posted by Raphaël Nunes — Tue Jul 16, 2024 11:34 am


Re: reading NDF files in python
https://opensourceinstruments.com:443/Forums/viewtopic.php?t=30&p=110#p110

> We were not aware that it was possible to export data in EDF format while recording. After checking the documentation,
> we found this could be very helpful for us.

Good.

> However, when we export data with Neuroplayer on the machine used for recording data with Neurorecorder, we often
> experience crashes during the export process.

When you say "crash", does the Neuroplayer quit and disappear? Or does it freeze so that it will not respond to button presses? Or does it stop with an error message in its text window? If there is an error message, what is the message?

The Neuroplayer should never quit or freeze during the export process. If it does, there is a bug in my code that I have to fix. So long as the computer operating system keeps running, the Neuroplayer should keep running or stop and give you an error message.

> The only reason we can think of that could cause these crashes is the limited computational capacity of the computer
> used to record the signals.

Limited computational capacity should not cause a crash. I think it is more likely that there is a bug in my exporter code.

> Indeed CPU usage is at 97-100% when recording and exporting at the same time.

In that case, the Neuroplayer will not be able to export as fast as the data is recorded. But the Neuroplayer should not crash.

> Therefore, we aim to transfer the acquisition to a more powerful computer to record and export the signals
> simultaneously.

There is no harm in using a faster computer. But if it's a bug in my code, the bug will occur again. If the Neuroplayer crashes, please send me the NDF file it was exporting when it crashed. I will then be able to fix the problem, and you can export during recording.

> Is there a specific internal extension card (or another specific component) we have to add to the new machine we are
> assembling?

No. The Neuroplayer uses standard CPU instructions.

I'm going to study your answers to my synchronization questions now.

Best Wishes, Kevan

Statistics: Posted by Kevan Hashemi — Mon Jul 15, 2024 2:34 pm


Re: reading NDF files in python
https://opensourceinstruments.com:443/Forums/viewtopic.php?t=30&p=109#p109

Thank you very much for your precious help.

We have carefully analyzed your suggestions and there are a few elements that we would like to clarify with you.

1) “If I were you, I would export as you are recording, so the export file is always ready to use.”

We were not aware that it was possible to export data in EDF format while recording. After checking the documentation, we found this could be very helpful for us. However, when we export data with Neuroplayer on the machine used for recording data with Neurorecorder, we often experience crashes during the export process.

The only reason we can think of that could cause these crashes is the limited computational capacity of the computer used to record the signals. Indeed CPU usage is at 97-100% when recording and exporting at the same time. Therefore, we aim to transfer the acquisition to a more powerful computer to record and export the signals simultaneously. Is there a specific internal extension card (or another specific component) we have to add to the new machine we are assembling?

2) “How big is the overlap? How long are your individual NDF files? Do you have the Synchronize button checked in Neuroplayer when you record? How long is the overlap? Is it seconds, minutes, or tens of minutes?”

Let me better explain our overlap problem with an example.

Imagine we want to export the successive NDF files “M1702291203.ndf” and “M1702294804.ndf” to EDF format and we have chosen a processing interval of 8 seconds in the Neuroplayer menu.

Each of these two files corresponds to a one-hour recording, which is well reflected by the Unix-Time index difference in the file names: 1702294804 - 1702291203 = 3601 seconds.

Once the conversion to EDF is complete:
- “M1702291203.ndf” has been converted to “E1702291203.edf”
- “M1702294804.ndf” has been converted to “E1702294795.edf”

Thus, the Unix-Time index difference in the EDF file names is no longer 3601 seconds but 1702294795 - 1702291203 = 3592 seconds. This results in an overlap of 3600 - 3592 = 8 seconds between the two recordings, with both EDF files having a duration of 3600 seconds.

Similarly, if we choose a processing interval of 2 seconds:
- “M1702291203.ndf” is converted to “E1702291203.edf”
- “M1702294804.ndf” is converted to “E1702294801.edf”

We therefore have an overlap of 3600 - (1702294801 - 1702291203) = 3600 - 3598 = 2 seconds. Thus, the overlap matches the duration of the processing interval.
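
As an illustration, here is a small Python sketch of this bookkeeping, using the Unix seconds embedded in the file names from the 8-second example above:

def stamp(name):
    # "M1702291203.ndf" or "E1702294795.edf" -> the Unix seconds in the name
    return int(name[1:-4])

edf_1, edf_2 = "E1702291203.edf", "E1702294795.edf"  # 8-second-interval example
file_length = 3600  # each export holds one hour of data

overlap = (stamp(edf_1) + file_length) - stamp(edf_2)
print(overlap)  # 8 seconds, equal to the processing interval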

Likewise, if we launch the successive export of “M1702291203.ndf”, “M1702294804.ndf”, and “M1702298405.ndf” with an 8-second processing interval:
- “M1702291203.ndf” is converted to “E1702291203.edf” (1)
- “M1702294804.ndf” is converted to “E1702294795.edf” (2)
- “M1702298405.ndf” is converted to “E1702298389.edf” (3)

Thus, we observe:
- A temporal shift of 8 seconds between “M1702294804.ndf” and “E1702294795.edf” (2)
- A temporal shift of 16 seconds between “M1702298405.ndf” and “E1702298389.edf” (3)

We can then hypothesize that: temporal shift = (number of files already exported in the batch) * processing interval duration.

However, if we launch the export (still with an 8-second processing interval) of two batches of files separately, for instance:
- First of “M1702291203.ndf” and “M1702294804.ndf”
- And then of “M1702298405.ndf” and “M1702302006.ndf”

Then,
- “M1702291203.ndf” is converted to “E1702291203.edf” (1)
- “M1702294804.ndf” is converted to “E1702294795.edf” (2)
- “M1702298405.ndf” is converted to “E1702298405.edf” (3)
- “M1702302006.ndf” is converted to “E1702301998.edf” (4)

Thus, we observe:
- No temporal shift for the first exported file of each batch, (1) and (3)
- An 8-second shift for the second exported file of each batch, (2) and (4)

This is problematic for us because, if a crash occurs during an export, we have to reindex all the remaining files to be exported before starting their export.

Therefore, could you please let us know if this is normal and, if so, if it is possible to avoid this overlap?

Regarding the Synchronize button, we do check it when we record our data with Neurorecorder.

Best wishes,
Raphaël

Statistics: Posted by Raphaël Nunes — Mon Jul 15, 2024 11:44 am


Re: reading NDF files in python
https://opensourceinstruments.com:443/Forums/viewtopic.php?t=30&p=106#p106

"If I understand correctly the pyecog suite may not be suitable for loading
the NDF files recorded with the AL systems so we will refrain from
attempting to use it and employ another strategy."

I am confident that Marco Leite will be happy to add support for the ALT. I am going to send him a link to this discussion.

"Which language did you use to code the exporter function of neuroplayer? Would you be keen in sharing your code that we could use as a basis?"

I use Pascal for anything that has to run fast; see the link in my previous post.

"Indeed the exporter function of the neuroplayer is not very practical for
our use since when exporting multiple NDF files together. We need to do
this separately for different batches of a given recording. i.e. we record
an animal for one week, we export the data, in the meanwhile we keep
recording, and one week later we need to export the second week of
recording."

If I were you, I would export as you are recording, so the export file is always ready to use.

"The exporter function creates an overlap between the resulting EDF binary
files and this overlap is not always the same. Therefore when concatenating
signals from different batches we are not able to infer how much to cut on
each side of the files (the last one of batch 1 and the first one of batch
2)."

How big is the overlap?

How long are your individual NDF files?

Do you have the Synchronize button checked in the Neuroplayer when you record?

How long is the overlap? Is it seconds, minutes, or tens of minutes?

"Maybe we are missing something that already allows us to do so? Is this
overlap necessary for the conversion process?"

There is no overlap necessary in the conversion process. But there is always the issue of clocks running at different speeds. The ODR, TCB, and ALT all use a 1 ppm temperature-compensated clock, so their time will be correct to within +-0.6 s per week. But your computer clock will not be as accurate. If you have your computer connected to the Internet, and it is correcting its clock using Network Time, then your computer clock will drift by no more than +-5 s during the week. On average, the computer clock will be exactly correct.

How should we handle the disagreement between the computer and the telemetry receiver? There are two ways. One way is to trust the receiver clock, and use only the receiver clock. When recording one-hour NDF files, we record one hour of receiver data, start a new NDF file, and give the new file a timestamp that is 3600 s after the timestamp of the previous file. There is no overlap. There is no loss of data. After eight weeks, the recording may be wrong by 5 s. This is what happens if you uncheck the Synchronize button in the Neurorecorder.

The second way is to synchronize the NDF files each time we make a new one. We reset the telemetry receiver clock so that it agrees with the computer clock, and begin a new recording at the start of a new computer-clock second. We give the file a timestamp equal to the computer time. If the computer clock runs faster than the receiver clock, the NDF file names will be separated by more than 3600 s. If the computer clock runs slower, they will be separated by less than 3600 s. The sad thing about this method is that when we reset the telemetry receiver, we lose some data, perhaps a half-second of data. This is what happens if you check the Synchronize button in the Neurorecorder.
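
As an aside, and only as an illustrative sketch (this is not part of our code), for one-hour recordings you can tell from the file names alone whether the Synchronize button was checked, because free-running file names are exactly 3600 s apart:

# Example file names from earlier in this discussion.
names = ["M1702291203.ndf", "M1702294804.ndf", "M1702298405.ndf"]
stamps = [int(n[1:-4]) for n in names]
gaps = [b - a for a, b in zip(stamps, stamps[1:])]
print(gaps)  # [3601, 3601]: spacing other than 3600 s means Synchronize was checked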

When we have Animal Cage Cameras (ACCs) recording synchronous video, we have to check Synchronize so the telemetry receiver and the cameras can stay on the same clock as the computer. Otherwise, you can use the telemetry receiver as your reference clock.

Best Wishes, Kevan

Statistics: Posted by Kevan Hashemi — Thu Jul 11, 2024 2:05 pm


reading NDF files in python
https://opensourceinstruments.com:443/Forums/viewtopic.php?t=30&p=105#p105



This repository is maintained by Marco Leite of ION/UCL. As yet, there is no user-manual and no help page. The routine for importing into the PyEcog program works with recordings made by Octal Data Receivers (ODRs) and Telemetry Control Boxes (TCBs). So far as we know, it has not been configured for Animal Location Trackers (ALTs). The only difference between the messages recorded by these three receivers is the size of the payload attached to each message. The ODR has 0 bytes of payload, the TCB 2 bytes, and the ALT 16 bytes. Enhancing PyEcog for ALT data would be trivial, and we are happy to ask Marco to do so.
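
For anyone writing their own Python reader, here is a minimal sketch of how the payload sizes come into play when splitting a block of recorded messages. Treat the four-byte core message, the field order, and the byte order shown here as placeholders; the message-format description linked below has the authoritative layout:

PAYLOAD_BYTES = {"ODR": 0, "TCB": 2, "ALT": 16}
CORE_BYTES = 4  # placeholder core-message size; check the format description

def split_messages(data: bytes, receiver: str):
    # Yield (channel, sample, payload) for each fixed-size message in the block.
    size = CORE_BYTES + PAYLOAD_BYTES[receiver]
    for i in range(0, len(data) - size + 1, size):
        msg = data[i:i + size]
        channel = msg[0]                          # ID byte, assumed first
        sample = int.from_bytes(msg[1:3], "big")  # sample value, byte order assumed
        yield channel, sample, msg[CORE_BYTES:]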

We describe the NDF format here:



We describe the telemetry message format here:



The code we use to reconstruct telemetry signals that have been affected by signal loss, interference, and message duplication is in the lwdaq_sct_receiver routine here:



The Neuroplayer.tcl, Neurorecorder.tcl, and Receiver.tcl files are all available on our GitHub repository, and in your own LWDAQ distribution.

Best Wishes, Kevan

Statistics: Posted by Kevan Hashemi — Thu Jul 11, 2024 1:47 pm

