
Timer won't trigger. What's wrong?
You haven't said what sort of Timer control you are using and you haven't
shown your "MilliTimer" code, so without further details from you it is
hard to pin down the problem. If you are using the standard VB Timer
control then you need to be aware that any code which ties up your thread
(such as your current Do . . . Loop) will also tie up the Timer, unless
you explicitly allow events to be processed within the loop by calling
DoEvents. So, you should leave the DoEvents in, which should allow your
Timer to continue to fire. However, you may be doing something in your
"MilliTimer" code that is upsetting things. Post that code so that we can
have a look at it.
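To see the difference, here is a minimal sketch (assuming a form with a
Timer control named Timer1; VB's Timer function returns seconds since
midnight):

    ' Without DoEvents this loop starves the Timer completely:
    Dim t As Single
    t = Timer
    Do While Timer - t < 5
        ' Timer1_Timer will NOT fire while we spin here
    Loop

    ' With DoEvents, queued events (including Timer1_Timer) get serviced:
    t = Timer
    Do While Timer - t < 5
        DoEvents    ' lets Timer1_Timer and other pending events run
    Loop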
In any case, your current method (calling a routine that busy-waits in a
tight loop until either an input from your sensor has arrived or until five
seconds have elapsed) is a very wasteful way of doing what you appear to
want to do (unless, of course, you have some very specific reason for
wanting to do it that way). Even if you do want to do it that way, though,
you don't need a separate timeout Timer at all. Instead, simply check the
elapsed time within your current Do . . . Loop.
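Something along these lines would do it (just a sketch, using VB's built-in
Timer function instead of your MilliTimer, assuming ResponseFlag is the
global you describe, and ignoring the midnight rollover of Timer):

    Function SendQuery(CommandString As String) As Boolean
        Dim StartTime As Single
        ResponseFlag = False                ' clear any stale flag
        SICOMMPutString CommandString       ' send out the query command
        StartTime = Timer
        Do
            DoEvents                        ' let the Timer and comm events run
        Loop Until ResponseFlag Or (Timer - StartTime > 5)
        SendQuery = ResponseFlag            ' False means we timed out
    End Function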
By the way, your Timer will only run at your specified Interval (100
milliseconds, in your case) if the code in its Timer Event takes less than
that amount of time to execute, so check out that side of things. It is
quite possible that your current "Timer Event" code is itself getting bogged
down waiting for a response from your serial port, and if that is the case
then no further Timer events will fire. In fact, without further detail of
what you are doing, I think that is the most likely cause of your problem.
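If that turns out to be the case, the cure is to make the Timer event
non-blocking: take whatever bytes have already arrived and return
immediately, rather than waiting for a complete response. A rough sketch
(assuming the MSComm control and a module-level Buffer string; substitute
whatever your SICOMM library actually provides):

    Private Sub Timer1_Timer()
        ' Grab only what has already arrived; never sit and wait for more
        If MSComm1.InBufferCount > 0 Then
            Buffer = Buffer & MSComm1.Input
            ' ... parse Buffer here, update the plot, and set
            ' ResponseFlag when the query response turns up ...
        End If
    End Sub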
Also, if it is the standard VB Timer then the Interval you actually get will
not usually be the Interval you have requested. Moreover, this difference
(requested Interval versus achieved Interval) depends on the version of
Windows on which your code is running. In WinNT, for example, it will be the
next available multiple of 10 milliseconds and on Win98 it will be the next
available multiple of 55 milliseconds.
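If you want to see what granularity you are actually getting, you can
measure the achieved interval from inside the Timer event itself (a sketch
using the GetTickCount API):

    Private Declare Function GetTickCount Lib "kernel32" () As Long

    Private Sub Timer1_Timer()
        Static LastTick As Long
        If LastTick <> 0 Then
            Debug.Print "Achieved interval: " & _
                (GetTickCount - LastTick) & " ms"
        End If
        LastTick = GetTickCount
    End Sub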
Post more of your code and explain in more detail what you are actually
doing.
Mike
Quote:
> This is a Timer Control related question.
> I have a timer running that looks for input from the serial port
> (like, reads the temperature) and updates a plot (temperature vs time).
> I also have a way of sending a query out through the serial port and
> the temperature instrument sends back an answer embedded in the usual
> stream of data that it spits out.
> The timer routine is able to pluck out the query response and set a
> global flag to alert the program the query answer has arrived.
> My routine for performing the query is:
> Function SendQuery(CommandString As String) As Boolean
>     SICOMMPutString CommandString   ' Send out the query command
>     MilliTimer True                 ' Reset a timeout timer (an argument
>                                     ' of FALSE sends back the elapsed time)
>     Do
>         DoEvents
>         '
>         ' A timeout flag... to abort the loop if the response takes
>         ' more than 5 seconds
>         If MilliTimer(False) > 5 Then
>             i = 1
>         Else
>             i = 0
>         End If
>         '
>         ' ResponseFlag is a global set to TRUE (in the timer routine)
>         ' if the response is found in the input data stream
>     Loop Until i = 1 Or ResponseFlag
>     If i = 1 Then
>         SendQuery = False
>     Else
>         SendQuery = True
>     End If
> End Function
> BUT the problem is, the timer stops getting triggered during the Do loop,
> even without taking out the DoEvents.
> The timer has an interval of 100 milliseconds. I thought it was
> supposed to run every 0.1 seconds no matter what my application is
> doing.
> What am I doing wrong? Why does the timer fail to trigger?