PSoC™ 5, 3 & 1 Forum Discussions
I am getting an "analog routing solution" error when I try to connect an AMux, an Opamp, and an ADC.
Can anyone help me solve this error?
I am attaching my workspace here.
Thanks,
I am using a PSoC 5LP and need the 1.024 V Vref in the schematic.
Because all 4 comparators are used, I am not able to "AutoEnable" the 1.024 V Vref; the datasheet says Vref needs to be manually enabled in the analog block.
But I cannot find instructions on how to manually enable Vref.
Can someone help me with this?
Thanks.
Hello,
I'm using this kit for a learning project, and it looks like it could be very useful for a particular task.
I need to count pulses on 8 lines; the pulses can be set to be 0.1 or 1 microsecond wide, but the gap between them is variable. It would be good if the gap could be as small as 1 microsecond. The pulses arrive at random times on all 8 lines. Every 20 ms or so, I need to read the counters, do a little math, and send a signal on an output line.
I also need to connect several of these systems to a PC to load configurations such as counting time. Ethernet would be good, but RS-485 should also work.
In the IDE, I have configured 8 Basic Counters and the compilation succeeded, but I haven't tested anything yet. Before I go on, I'd like to know if this part is a good fit for the application or if I should upgrade. I'd like to stay with the kit form factor for at least the prototypes.
Thanks,
John
1. When I use a buffer greater than 4 bytes, does your interrupt routine fire upon arrival of the first byte and move that byte to the RAM buffer, thus performing a byte-by-byte interrupt-based transfer from the FIFO to the RAM?
Or does it wait for 4 bytes (thus stalling my program until all 4 bytes are received) and operate as a 4-byte interrupt-based transfer from the FIFO to the RAM? In that case, does the "Byte Received" interrupt fire every 4 bytes for a long message?
2. If I want to use the On Byte Received interrupt as well, which will run first, your routine or mine (please address both modes of using interrupts: in the interrupt C file and in main)?
3. Please describe the delay times for a 5-byte transfer:
How long do I have to wait between the time the interrupt fires and the time the first byte is readable in the RAM buffer? How long do I have to wait for the next byte to be ready in the RAM buffer, and so on?
4. Can I create a counter that will fire an interrupt after n bytes have been received in the RAM buffer? (How do I wire such a counter to the UART?)
5. Is there an option to create a "FIFO empty" interrupt for RX?
6. Is it possible to create a UART with more than a 4-byte HW buffer (so my SW routine will not be interrupted until I want to check the buffer)?
7. Just to be clear: all functions (both TX and RX) use the FIFO buffer when the buffer is 4 bytes or smaller, and only the RAM buffer (ignoring the FIFO buffer) for buffers over 4 bytes. Please confirm.
Thanks in advance,
Asafya
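On questions 4 and 6 above, a common pattern when the hardware FIFO depth is fixed is to let the byte-received ISR feed a software ring buffer and raise a flag once n bytes have accumulated. The sketch below shows only that pattern; the buffer size, the threshold, and the way bytes reach the ISR are all arbitrary choices for illustration (the real ISR would read the UART component's RX register instead of taking the byte as a parameter).

```c
#include <stdint.h>
#include <stdbool.h>

#define RX_BUF_SIZE 64   /* power of two so wrap-around is a cheap mask */
#define RX_THRESHOLD 5   /* "fire" after n bytes; n = 5 chosen arbitrarily */

static volatile uint8_t  rx_buf[RX_BUF_SIZE];
static volatile uint16_t rx_head, rx_tail;
static volatile bool     rx_threshold_hit;

static uint16_t rx_count(void)
{
    return (uint16_t)((rx_head - rx_tail) & (RX_BUF_SIZE - 1));
}

/* Body of the byte-received ISR. On real hardware the byte would come
 * from the UART's RX data register rather than a parameter. */
void rx_isr_body(uint8_t byte)
{
    rx_buf[rx_head] = byte;
    rx_head = (uint16_t)((rx_head + 1) & (RX_BUF_SIZE - 1));
    if (rx_count() >= RX_THRESHOLD)
        rx_threshold_hit = true;   /* main loop polls or acts on this */
}

/* Main-loop side: pop one byte, or return false if the buffer is empty. */
bool rx_pop(uint8_t *out)
{
    if (rx_head == rx_tail)
        return false;
    *out = rx_buf[rx_tail];
    rx_tail = (uint16_t)((rx_tail + 1) & (RX_BUF_SIZE - 1));
    return true;
}
```

This sidesteps the fixed FIFO depth entirely: the ISR drains the FIFO as bytes arrive, and the application only looks at the ring buffer when it chooses to.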
Hi,
I wanted to get advice on which component to use with a datagram as shown below. The UART component supports 8 (default) up to 9 data bits, and in the Advanced tab the TX buffer and RX buffer defaults are set to 4 bytes.
How should one configure the UART component for this datagram?
Thanks for the help!
Oz
Hi to everyone, and thanks in advance.
Here is my question.
My question comes from page 2 of the ADC_SAR reference sheet: how can I be sure that the signals are synchronized? I will try to explain myself.
I got this:
As can be seen in the image above, I have an ADC_SAR controlled by a 1 kHz input signal (A) on "soc", and in the ADC_SAR's configuration I have a clock frequency of 1777.778, which gives me one isr_ADC output signal (B) every 1 µs.
If I understand correctly, on the rising edge of the 1 kHz signal (A) the ADC_SAR starts to convert the input, and after a few microseconds I get (B), the isr_ADC. And this happens JUST ONE TIME on each 1 kHz rising edge, doesn't it?
My doubts are:
1) The condition "at least one ADC_SAR clock cycle wide" => that one is OK.
2) "This signal should be synchronized to the ADC_SAR clock" => how can I be sure this is satisfied with the settings shown above? Do I need to add something?
I saw strange behavior on the isr_ADC output (using an oscilloscope): within the 1 ms period (1 kHz on A), from time to time I could see two (or more) interrupts following one another (at varying distances).
Why can this happen?
This is more or less the idea.
I hope some of you can explain how I can set up the ADC correctly.
Thank you very much for your time.
Regards.
Hi,
I want to interface an SD card with the PSoC 5LP to store data coming in at 1 Msps.
I want to do it without using the core (using DMA transfers). I would also like to write to a file so that it would be easier to view the data on a laptop and process it further. PSoC Creator does have the emFile component, but I am unclear on how to write to a file using DMA while using the emFile component. The emFile example provided by PSoC uses the core to write values into a text file; I would like to do that using DMA. Can someone please point me to some example code and give me some helpful pointers on how to implement this?
Hello, I have a problem with USBUART. I'm currently working with the CY8C5888LTI-LP097 kit. The MCU is powered at 3.3 V, so I set all the voltages in the system settings to 3.3 V. Picture below.
I also set the USBUART voltage to 3 V and removed D2 in the VBUS line to prevent getting 5 V from USB.
Now when I try to open the COM port, it reports an error. See picture below.
Any idea what is causing the problem?
Hi,
I have a few questions regarding the IDACs in the PSoC 5LP CY8CKIT-059:
1. In the datasheet it's stated that the maximum DAC sample rate is 8 Msps and that the settling time to 0.5 LSB is 125 ns for the 255 µA range, for a full-scale transition, with the DAC running in fast mode and driving a 600 Ω, 15 pF load.
What happens if I strobe the DAC at a frequency higher than 8 MHz? I assume the DAC will keep working, but it won't be guaranteed that the output settles to 0.5 LSB before the DAC is strobed again. Is the settling time for something like a half-scale transition shorter than for a full-scale transition, and is it affected by the attached load?
2. How are the DACs trimmed? I'm using 4 IDACs in source mode and would like to get them as close to ideal as possible, but it's even more important to get them as close to each other as possible.
I found this question already answered in this thread, but it did not work for me. The sample calibration routine does the following, with the goal of adjusting the calibration code to get 256 µA from the IDAC when the input code is 255 in the mid range:
- Fix the digital input code to 255 and the calibration code to 128 (128 = 0b10000000 is the default value) and capture the DAC output.
- Determine the gain error.
- Apply the corrected calibration code and capture the DAC output.
The default value of Cal[7:0] is [10000000]. Values lower than this will decrease the gain, and values greater than this will increase it.
When I increase the calibration code (for example to 0b10000111), the output current decreases, but if I understood correctly, it should actually increase.
What am I doing wrong, and what would be the best way to trim all 4 DACs to get them as close to each other as possible?
Is there a specific way this measurement should be done? I tried connecting an ammeter directly between the DAC output pins and ground, and also through 1 k and 3.7 k resistors, but it always behaves the same way.
This is an example of how I'm trimming the DACs in code:
IDAC_0_trim_val = 0b10000111;
CY_SET_REG8(IDAC8_0_TR_PTR, IDAC_0_trim_val);
3. I've found the DAC Block Test Register in the PSoC 5LP Registers TRM, but I can't find any application notes or discussions about it. How does it work, and how and when should it be used?
Any help would be appreciated.
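For matching the four IDACs to each other, one approach that sidesteps the confusion about which direction the trim code moves the gain is a brute-force sweep: try every trim code, measure, and keep the one closest to the target. The sketch below uses a toy transfer curve in place of a real measurement, so it runs stand-alone; on hardware, the measurement function would instead write the trim code to the DAC's trim register (as in the `CY_SET_REG8` line above), let the output settle, and read the ammeter.

```c
#include <stdint.h>

/* Toy model of the DAC output (in uA) as a function of the 8-bit trim
 * code, for illustration only. On real silicon, measure instead. */
static double measure_output_ua(uint8_t trim)
{
    return 250.0 + (trim - 128) * 0.05;   /* arbitrary demo curve */
}

/* Sweep all 256 trim codes and return the one whose measured output is
 * closest to target_ua. Works whether gain rises or falls with the code. */
uint8_t find_best_trim(double target_ua)
{
    uint8_t best = 128;
    double best_err = 1e9;
    for (int code = 0; code < 256; code++) {
        double err = measure_output_ua((uint8_t)code) - target_ua;
        if (err < 0) err = -err;
        if (err < best_err) {
            best_err = err;
            best = (uint8_t)code;
        }
    }
    return best;
}
```

Running this once per IDAC against the same target current (and the same measurement setup) gives four trim codes chosen for mutual matching rather than absolute accuracy.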