Is Input$ Really THAT Slow? 
 Is Input$ Really THAT Slow?

In my quest of trying to copy large files over slow networks, I thought I'd
try downloading just a small bit of the file, and extrapolate from that how
long it will take. My code is simply:

Function GetDownloadTime(ByVal percent As Double) As Double
   Dim a$, FF%, filesize&, sample&, gt1&, gt2&

   FF% = FreeFile()
   filesize& = FileLen(ColDir$)
   sample& = filesize& * percent / 100
   gt1& = GetTickCount&
   Open ColDir$ For Binary As #FF%
   a$ = Input$(sample&, FF%)
   Close FF%
   gt2& = GetTickCount&
   GetDownloadTime = (gt2& - gt1&) * CDbl(filesize&) / sample& / 1000

End Function

By putting in 100%, to read the whole file, the file I'm downloading from the
LAN, which is about 5.5 MB in size, takes about 6 seconds using the code
above. 6 FULL SECONDS! That was a huge shock, so I went and timed my current
code reading the same file from the same location: 0.3 seconds!

My normal routine to read in the same data can be summarized as:

   Do While Not EOF(FF%)
      Input #FF%, a$
      If a$ > "" Then
         If a$ = "end of file" Then
            gotit% = True
         Else
            x& = x& + 1
            DwgPath(x&) = a$
            DwgName(x&) = GetNumber(DwgPath(x&))
            'BaseNumber(x&) = GetBase(DwgName(x&))
         End If
      End If
   Loop

Notice that each of the approximately 150,000 lines is read in separately
using the Input # statement; the variable is then separated into three
arrays: one with the full path/name, one with the name, and another with the
base name. GetNumber and GetBase are, obviously, external functions.

Despite all this, this code executes in 0.3 seconds. (And included in the
0.3 seconds is a loop which goes back through and processes each item
again.) Meanwhile the function code above, reading the same file from the
same location, takes 20 times longer!

Can it be that the Input # statement is THAT much faster than Input$()? I
would have thought that reading the entire file into a variable in one shot
would be the fastest method possible.

What's going on here??

--
Regards,

Rick Raisley
heavymetal-A-T-bellsouth-D-O-T-net



Sun, 16 Jan 2011 21:51:34 GMT  
 Is Input$ Really THAT Slow?


Quote:
> In my quest of trying to copy large files over slow networks, I thought I'd
> try downloading just a small bit of the file, and extrapolate from that how
> long it will take. My code is simply:

Try using Get and reading into a byte array and see how that does; something
like:

Function GetDownloadTime(ByVal FilePath As String, ByVal percent As Double)
As Double
Dim b() As Byte
Dim ff As Long
Dim filesize As Long
Dim sample As Long
Dim gt1 As Long, gt2 As Long ' "Dim gt1, gt2 As Long" would leave gt1 a Variant
ff = FreeFile
filesize = FileLen(FilePath)
If filesize Then
  sample = filesize * percent / 100
  If sample < 1 Then sample = 1
  ReDim b(1 To sample)
  gt1 = GetTickCount
  Open FilePath For Binary As #ff
  Get #ff, , b
  Close #ff
  gt2 = GetTickCount
  GetDownloadTime = (gt2 - gt1) * CDbl(filesize) / sample / 1000
End If
End Function

(sorry, just can NOT use those ugly type declaration characters all through
the code!)
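
A hypothetical call, just to show the intent (the UNC path and the 5% figure
here are made up for illustration):

```vb
' Hypothetical usage: estimate the full-transfer time from a 5% sample.
Dim est As Double
est = GetDownloadTime("\\server\share\drawings.lst", 5)
Debug.Print "Estimated time for full file: " & Format$(est, "0.0") & " sec"
```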



Sun, 16 Jan 2011 22:59:29 GMT  
 Is Input$ Really THAT Slow?

Quote:
> In my quest of trying to copy large files over slow networks, I thought I'd
> try downloading just a small bit of the file, and extrapolate from that how
> long it will take. My code is simply:

> [snip quoted code and timings]

Did you reboot your computer before you tried your "faster" code? If not, it
is possible that the file you read in the first time was being read directly
out of your computer's (download) buffer when you executed your "faster"
code, and not being re-read across the network.

Rick



Sun, 16 Jan 2011 23:01:49 GMT  
 Is Input$ Really THAT Slow?


Quote:
>> In my quest of trying to copy large files over slow networks, I thought I'd
>> try downloading just a small bit of the file, and extrapolate from that how
>> long it will take. My code is simply:

>> [snip quoted code and timings]

> Did you reboot your computer before you tried your "faster" code? If not,
> it is possible that the file you read in the first time was being read
> directly out of your computer's (download) buffer when you executed your
> "faster" code, and not being re-read across the network.

No, I didn't reboot. But I did alternate back and forth between the two
functions, with consistent results. I realize also that although the
statements used to input the data are different, one file is also opened as
Binary and the other as text. Could that be the difference?

--
Regards,

Rick Raisley
heavymetal-A-T-bellsouth-D-O-T-net



Sun, 16 Jan 2011 23:16:05 GMT  
 Is Input$ Really THAT Slow?


Quote:
> In my quest of trying to copy large files over slow networks, I thought I'd
> try downloading just a small bit of the file, and extrapolate from that how
> long it will take. My code is simply:

> [snip GetDownloadTime code]

Seems a convoluted method, try something like this:
(Warning: Unchecked air code!)

ff = FreeFile
Open ColDir$ For Binary As ff
a$ = Space$(LOF(ff))
Get ff, , a$
Close ff

I think your bottleneck MIGHT be the time VB takes to build the string a$: I
suspect your method above builds it up piece by piece, which is inherently
slow, while the method I show creates a blank string of the correct length
before it tries to read the file.
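
The preallocation point can be illustrated with a rough sketch in the same
spirit as the air code above (untested, for illustration only):

```vb
' Air-code sketch: growing a string by concatenation reallocates and copies
' the whole string on every pass, while preallocating with Space$() and
' filling via Mid$ writes into the existing buffer in place.
Dim i As Long, slow As String, fast As String

' Slow: each "&" makes a new, longer copy of the entire string
For i = 1 To 100000
   slow = slow & "x"
Next

' Fast: allocate once, then overwrite in place
fast = Space$(100000)
For i = 1 To 100000
   Mid$(fast, i, 1) = "x"
Next
```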

Regards
Dave O.



Sun, 16 Jan 2011 23:24:01 GMT  
 Is Input$ Really THAT Slow?

Quote:



>> In my quest of trying to copy large files over slow networks, I thought I'd
>> try downloading just a small bit of the file, and extrapolate from that how
>> long it will take. My code is simply:

> Try using Get and reading into a byte array and see how that does;
> something like:

> [snip byte-array GetDownloadTime code]

Incredible. Simply using the Get statement speeds it up by over 100 times! I
also tried it without the array, using a$, as:

a$ = Space$(sample&)
Get #FF, , a$

and it was almost as fast. I switched back and forth between the various
methods multiple times with consistent results. Very surprising.

So, using var = Input$() is BAAADD! Yet, Input var works pretty darn well.
Who'd have thunk it!! ;-)

Guess I'll have to search all of my projects for "= Input" and convert them
over!  lol!

Quote:
> (sorry, just can NOT use those ugly type declaration characters all
> through the code!)

Sorry. I can't seem to get away from them. I feel I just HAVE to know what a
variable is, and really dislike preceding a logical name with those silly
prefixes. Now, which was that? dblProfit? sngProfit? decProfit? curProfit?

--
Regards,

Rick Raisley
heavymetal-A-T-bellsouth-D-O-T-net



Sun, 16 Jan 2011 23:33:19 GMT  
 Is Input$ Really THAT Slow?
Actually, I don't use the Input method, so I can't speak fully for or
against it. My preference is something like this (don't forget to declare
the variables)...

  FileNum = FreeFile
  Open "C:\TEMP\Test.txt" For Binary As #FileNum
    TotalFile = Space(LOF(FileNum))
    Get #FileNum, , TotalFile
  Close #FileNum

which has always worked blindingly fast for me, although I can't speak for
its speed across a network connection (which I would guess is slower than
local storage due to connection overhead), as my experience with networks has
been minimal over the years. I think the above method also avoids some
end-of-file character issues that I seem to recall Input$ can be subject to.

Rick


Quote:


>>> In my quest of trying to copy large files over slow networks, I thought I'd
>>> try downloading just a small bit of the file, and extrapolate from that how
>>> long it will take. My code is simply:

>>> [snip quoted code and timings]

>> Did you reboot your computer before you tried your "faster" code? If not,
>> it is possible that the file you read in the first time was being read
>> directly out of your computer's (download) buffer when you executed your
>> "faster" code, and not being re-read across the network.

> No, I didn't reboot. But I did alternate back and forth between the two
> functions, with consistent results. I realize also that although the
> statement used to input the data is different, also one is opened as
> Binary, the other as text. Could that be the difference?

> --
> Regards,

> Rick Raisley
> heavymetal-A-T-bellsouth-D-O-T-net



Sun, 16 Jan 2011 23:43:44 GMT  
 Is Input$ Really THAT Slow?

Quote:



>> In my quest of trying to copy large files over slow networks, I thought I'd
>> try downloading just a small bit of the file, and extrapolate from that how
>> long it will take. My code is simply:

>> [snip GetDownloadTime code]

> Seems a convoluted method, try something like this:
> (Warning: Unchecked air code!)

> ff = FreeFile
> Open ColDir$ For Binary As ff
> a$ = Space$(LOF(ff))
> Get ff, , a$
> Close ff

Well, it's actually no more convoluted for the actual reading of the file.
You're using Get ff, , a$; I'm using a$ = Input$(size, ff). A single command
either way, but a different command. Turns out there's a world of difference
in speed.

Quote:
> I think your bottleneck MIGHT be the time VB takes to create the string a,
> I suspect your method above builds it up line by line which is inherently
> slow, the method I show creates a blank string of the correct length
> before it tries to read the file.

Actually, if I treat it as a normal text file and use the Input # statement
to bring it in line by line, it's pretty speedy: about 0.3 seconds for
150,000+ lines in a 5.5 MB file. Your method using Get is much quicker, but
the old way wasn't slow. However, using Input$(size, ff) takes over 6
seconds. An incredibly slower method, as it turns out.

--
Regards,

Rick Raisley
heavymetal-A-T-bellsouth-D-O-T-net



Sun, 16 Jan 2011 23:49:24 GMT  