100 baud

Coyoteboy

Senior Member
Is it possible for me to get an 08M to communicate at 100 baud? My hardware is especially slow and I'm struggling to find software for a PC that can handle 100 baud - they all seem to be 110 only, and I'm not good enough at writing C to do my own, LOL. Figured it might be easier to use a PICAXE as a sort of buffer.
 

Technical

Technical Support
Staff member
Not easily, no.

You could try underclocking the 08M to 31kHz and then using the 1200 baud setting, but there is still a bit of error in the frequency. However, depending on the third-party system, it may work.
 

Coyoteboy

Senior Member
Ah right - I had thought of underclocking it but wasn't sure how low I could go without problems. The other hardware is a rather ham-fisted serial datastream from my engine control unit, and all I have is the information that it's 100 baud and a fixed repeating set of byte variables. I've constructed an interfacing circuit for a PC using a MAX232 IC, so I'll have a crack at writing my own setup.

J
 

manuka

Senior Member
Serial slowdown, by 08M underclocking, works very well - to the extent that individual bits can even be heard, Morse style! Naturally BOTH Rx and Tx have to be underclocked the same.
 

premelec

Senior Member
Does the clocking revert to the proper programming clock when you turn the power off - i.e. must it be set in software, and is it volatile...? Thanks
 

manuka

Senior Member
Yes - it reverts to the normal 4MHz speed when powered down. That had me worried too, as I had visions of being locked in slow-coach limbo! Stan
 

womai

Senior Member
At that speed (100 baud) bit-banging the serial data stream should work reasonably well - provided you have an oscilloscope to tweak the exact delays. That would also offer the possibility of implementing non-blocking serial input... Since it's so slow, one of the free "soft scopes" that use the PC's audio input would be enough.
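One way to get that non-blocking behaviour at 100 baud is to poll the input pin rather than sitting in a blocking SERIN, and only drop into a bit-banged read once a start bit appears. A rough sketch of the idea (the pin number and idle-high polarity are assumptions about the interface):

symbol RX_PIN = pin3             ' assumption: serial data on input 3, idle high

Main:
  ' ... other foreground work can go here ...
  if RX_PIN = 1 then Main        ' no start bit yet, keep doing other work
  gosub ReadByte                 ' start bit seen - clock the byte in
  goto Main

ReadByte:
  ' bit-banged read of one byte would go here
  return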

Wolfgang
 

manuka

Senior Member
You can almost do it by ear at the slowest clock rate (~32kHz)! That's roughly 1/125th of the normal 4MHz, so 2400 bps slows down to something like 20 bps, and 300 bps to ~2.5 bps. Such lethargic rates have much scope when noise and weak signals otherwise hinder comms - a field many radio hams have explored recently. Try a Google search for "fuzzy modes".

I'd considered IR underclocking as well, since the 38kHz raw data is too fast for modulation etc. onto a wireless stream. Naturally the receiver IR module still has a 38kHz passband, however.
 

hippy

Ex-Staff (retired)
I think Wolfgang has the best idea; bit-banging. 100 baud has a 10ms bit time, which is a long time. I'd run at 8MHz to give maximum processing speed for checking byte framing and dealing with the data during the stop bits etc., and the data can then be sent to the PC at 9600 baud using SERTXD, minimising the time wasted before getting back to look for the next byte's start bit.

As long as the PICAXE's sampling is accurate to within +/-5% there shouldn't be any problems. PULSOUT can be used to get accurate timing; use it on a pin with an LED connected and it will show when data is being received.
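As a small illustration of that trick (pin 4 is just a placeholder for whichever spare output carries the LED): PULSOUT counts in 10us units at 4MHz, so it can serve as an accurate delay while also driving something visible on a scope or LED.

symbol LED_PIN = 4               ' assumption: spare output with an LED / scope probe

Main:
  pulsout LED_PIN, 1000          ' 10ms pulse = one 100 baud bit time (10us units at 4MHz)
  pause 90                       ' idle gap so the pulse train is easy to see
  goto Main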

A trick for keeping the delays and timing constant, once they have been sorted out, is to put such routines at the start of the program and skip round that code at reset. Then, as the main program code changes, the bit-banging routines won't get moved around in memory with their timing altered ...

GOTO ActualStart

BitBangedRoutine:
  ' time-critical bit-banged serial code lives here
  RETURN

ActualStart:
  ' main program code
  GOSUB BitBangedRoutine
  ' ...
  END

It should also be possible to use PULSIN to measure incoming bit times. High- or low-going periods should all have nominal lengths in multiples of 10ms, so it may be possible to calculate any bit sampling delays required automatically.
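A hedged sketch of that idea (pin 3 is a placeholder; at 4MHz PULSIN counts in 10us units, so a single 100 baud bit should read close to 1000):

symbol RX_PIN   = 3              ' assumption: serial data arrives on input 3
symbol bitTime  = w0             ' measured width of one high-going period

Measure:
  pulsin RX_PIN, 1, bitTime      ' width of the next high period, 10us units at 4MHz
  if bitTime = 0 then Measure    ' 0 means no pulse seen before the command timed out
  ' a run of N idle-level bits comes back as roughly N x 1000,
  ' so dividing by the nominal bit count gives the measured bit time
  sertxd ("High period: ", #bitTime, " x 10us", cr, lf)
  goto Measure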

If the data bytes are sent back-to-back, it will probably be necessary to do some fiddling to synchronise framing if the PICAXE turns on and starts reading halfway through a transmitted byte.
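One possible trick, assuming the repeating message has a short pause between repeats, is to wait for the line to sit at its idle level for longer than anything that can occur inside a character before arming the receiver. A sketch under that assumption (pin and polarity are placeholders):

symbol RX_PIN    = pin3          ' assumption: data on input 3, idle high
symbol idleCount = b2

WaitForGap:
  idleCount = 0
WaitLoop:
  if RX_PIN = 0 then WaitForGap  ' line active - restart the idle count
  pause 2                        ' roughly 2ms per pass
  idleCount = idleCount + 1
  if idleCount < 60 then WaitLoop ' ~120ms of idle - longer than any run of high bits in a character
  ' safe to start watching for the next start bit from here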

 

hippy

Ex-Staff (retired)
Untested framework code for reading 100 baud and sending it out to a PC ...

http://homepage.ntlworld.com/the.happy.hippy/picaxe/rxbyte.txt
 

Coyoteboy

Senior Member
Wow, hippy - thanks for that - saved me a LOT of time. I'll give it a try ASAP and see if I hit any problems!

Lots of interesting reading here! Never thought this would bring up so much interest!

James
 

hippy

Ex-Staff (retired)
It's an interesting challenge :)

Your challenge is going to be working out what actual delay values are needed to perform the sampling at the right time. A scope would undoubtedly help (the PULSOUT pulses can be viewed to see when the delays are occurring), and getting it to work at 110 baud is probably a good start. That way you can test by sending characters from a PC terminal program and checking that they come back, then adjust the delays for 100 baud. See my other post for bit-banged 110 baud output.
 

hippy

Ex-Staff (retired)
Definitely a challenge, but I've managed to get the code working at 110 baud and the bugs removed. The program has been updated - same link as earlier. Wire it up and use it as a loop-back for characters sent from a terminal program to test. It's implemented on an 08 but should work on an 08M. Corruption will occur if you type too quickly, because it won't be ready for the next character when it's sent.

Converting to 100 baud should be possible by extrapolating from the values for the delays I used for 110 baud ...

FIRST_BIT_DELAY is 1068 (10680us), and the time from the start bit to the middle of the first data bit is 13.636ms (1.5 * 1s/110), so the software overhead is 13636us - 10680us = 2956us. At 100 baud the time from the start bit to the middle of the first data bit should be 15ms (1.5 * 1s/100), so the delay has to be 15000us - 2956us = 12044us; in intervals of 10us that gives a FIRST_BIT_DELAY value of about 1204.

Likewise INTER_BIT_DELAY (~9.1ms bit time at 110 baud, 10ms at 100 baud) becomes about 476.
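Put another way, as a hedged illustration (the symbol names follow hippy's post, the values are his extrapolated figures and still need checking on the real hardware, and the 10us unit assumes the delays are generated with PULSOUT at 4MHz as suggested earlier):

' 100 baud delay values for 4MHz operation, in 10us units
symbol FIRST_BIT_DELAY = 1204    ' (15000us - 2956us software overhead) / 10us
symbol INTER_BIT_DELAY = 476     ' nominal 10000us bit time, less the per-bit loop overhead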

You'd best verify my maths, and you may still need to do some tweaking to the values. Use the same technique to adjust the values for 8MHz 08M operation.

Have fun!
 