The TX_Labview_MATLAB block is a powerful tool for data logging.
However, with software integration in mind, I would also like to log data to LabVIEW.
So my questions are:
1) Is there any LabVIEW code that can interface with the TX_Labview_MATLAB block?
2) If no such code is available, is it possible to know the data format?
It seems to be described at the following URL, but unfortunately I cannot read French.
http://lubink.free.fr/projelectro/Proto ... ew_FR.html
Thanks,
Masaki
Data Logging to LabVIEW
Re: Data Logging to LabVIEW
Hi Lubin,
In the meantime, I have tried to work through your French home page, where you describe the
protocol. I understand that all data from the "TX_LabVIEW_MATLAB" block carry the following header:
bits 7-4: channel number (0 to 15)
bits 3-2: data type, corresponding to the number of bytes composing the data (1 to 4)
bits 1-0: control-byte signature = "01"
Is this understanding correct?
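Assuming this header layout is right, decoding a header byte takes only a few lines. The following Python sketch is my own illustration (the function name is mine, and the mapping of the 2-bit size code 0-3 onto 1-4 data bytes is an assumption, since I cannot read the French page):

```python
# Hypothetical decoder for the header byte described above:
# bits 7-4 = channel, bits 3-2 = size code, bits 1-0 = signature "01".

def decode_header(byte):
    """Return (channel, n_bytes) if `byte` is a valid header, else None."""
    if byte & 0b11 != 0b01:         # bits 1-0: signature must be "01"
        return None
    channel = (byte >> 4) & 0x0F    # bits 7-4: channel number (0 to 15)
    size_code = (byte >> 2) & 0b11  # bits 3-2: data size code
    n_bytes = size_code + 1         # assumed mapping: code 0-3 -> 1-4 bytes
    return channel, n_bytes

print(decode_header(0b00000101))   # channel 0, int16 under the assumed mapping
```

Under these assumptions, `0b00000101` decodes as channel 0 followed by 2 data bytes, which matches the int16 reading below.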
If so, my next question is what happens when data bytes happen to take values that
resemble the header format. For example, in binary:
00000101 00010101 00020101 00000101 00010101 01000101 .....
I wrote these bytes as int16 data from channel 0,
but they can also be interpreted as int16 data from channel 1.
How does the algorithm in the "interface Tx-MATLAB" block distinguish between the two cases?
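To make the ambiguity concrete, here is a Python sketch of it (my own example stream with valid binary values, not the exact bytes quoted above; the function name and the size-code mapping are assumptions) showing that one byte stream can be framed consistently from two different start offsets:

```python
# Hypothetical framing routine: greedily cut the stream into
# header + data frames starting at a given offset.

def try_parse(stream, start):
    """Frame `stream` from `start`; return a list of (channel, data), or None."""
    frames, i = [], start
    while i < len(stream):
        hdr = stream[i]
        if hdr & 0b11 != 0b01:
            return None                  # invalid header where one is expected
        channel = (hdr >> 4) & 0x0F
        n = ((hdr >> 2) & 0b11) + 1      # assumed mapping: code 0-3 -> 1-4 bytes
        if i + 1 + n > len(stream):
            break                        # truncated trailing frame: ignore it
        frames.append((channel, stream[i + 1:i + 1 + n]))
        i += 1 + n
    return frames

stream = bytes([0b00000101, 0b00010101, 0b00100101,   # channel-0 int16 frames...
                0b00000101, 0b00010101, 0b01000101])  # ...or, shifted by one byte,
print(try_parse(stream, 0))                           # a channel-1 reading
print(try_parse(stream, 1))
```

Both offsets yield a self-consistent framing, so a single snapshot of these bytes cannot tell channel 0 from channel 1.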
Thanks,
Masaki
Based on your LabVIEW program, I have made my own interfacing program. Anyway, I attach the old LabVIEW files; I provide them "in their actual state".
I do not have LabVIEW installed on my computer right now.
Re: Data Logging to LabVIEW
Hi Masaki,
Masaki wrote:
bits 7-4: channel number (0 to 15)
bits 3-2: data type, corresponding to the number of bytes composing the data (1 to 4)
bits 1-0: control-byte signature = "01"
Correct!
Masaki wrote:
If so, my next question is what happens when data bytes happen to take values that
resemble the header format. For example, in binary:
00000101 00010101 00020101 00000101 00010101 01000101 .....
I wrote these bytes as int16 data from channel 0,
but they can also be interpreted as int16 data from channel 1.
On this example alone, it is indeed not possible to distinguish between the two possibilities.
However, in most cases the data will not all have their two lowest bits set to "01". As long as some data bytes do not end in "01", the LabVIEW algorithm converges to the correct framing after a few bytes.
If LabVIEW is started before the dsPIC begins sending data, there will be no error. Errors may occur only when you hot-plug the serial line, or start LabVIEW while the dsPIC is already sending data.
Note that the MATLAB script is even more powerful than the LabVIEW algorithm, because it extracts the best solution considering the whole data frame received at time T. In this sense it is optimal, because it uses all the information available. The LabVIEW algorithm is suboptimal because it analyses the data frame byte by byte, in chronological order, and makes decisions on the fly.
(A data frame is the set of bytes read from the COM port; it may contain anywhere from 0 to several hundred or thousand bytes.)
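The byte-by-byte convergence described above can be sketched as follows. This is a minimal Python reconstruction of the idea, not the actual LabVIEW code; the function name and the mapping of the 2-bit size code onto 1-4 data bytes are my assumptions:

```python
# Hypothetical resynchronising parser: if the stream is entered mid-frame
# (hot plug), skip bytes until a plausible header is found. Since most data
# bytes do not end in the "01" signature, framing locks on after a few bytes.

def resync_parse(stream):
    frames, i = [], 0
    while i < len(stream):
        hdr = stream[i]
        if hdr & 0b11 != 0b01:
            i += 1                       # not a header: resynchronise on next byte
            continue
        n = ((hdr >> 2) & 0b11) + 1      # assumed mapping: code 0-3 -> 1-4 bytes
        if i + 1 + n > len(stream):
            break                        # wait for the rest of the frame
        frames.append(((hdr >> 4) & 0x0F, stream[i + 1:i + 1 + n]))
        i += 1 + n
    return frames

# Hot-plug example: the first byte is a stray data byte (0xA0 does not end in
# "01"), so it is skipped and framing locks onto the genuine header after it.
print(resync_parse(bytes([0xA0, 0b00000101, 0x12, 0x34])))
```

Unlike the MATLAB script, this takes its decisions on the fly and never revisits earlier bytes, which is the suboptimality mentioned above.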
The LabVIEW algorithm that Masaki is using will be published on the website soon.
Lubin