loading big files in VRAM

By iamweasel2

Paladin (670)

30-11-2020, 04:10

I modified this code, written by Giovanni Nunes, to load a .SC2 file from disk into RAM; from RAM I send the data to VRAM to display the image.

The code works OK, but what if I want to read a big file (for instance an SC8 image, which is over 50 kB)? I understand that I will have to change the record size being read (which would be the best size choice in order not to make the code too much slower?) and keep reading/transferring, but how can I do that?

; load the contents of a file to VRAM
;
;
OpenFile: equ 0x0f ; BDOS: open file (FCB)
UpdateFile: equ 0x10 ; BDOS: close file
SetBuffer: equ 0x1a ; BDOS: set disk transfer address
GetFileSize: equ 0x23 ; BDOS: get file size
ReadRecords: equ 0x27 ; BDOS: random block read

CHGMOD: equ 0x005f ; BIOS: change screen mode
LDIRVM: equ 0x005c ; BIOS: block copy from RAM to VRAM
CHGET: equ 0x009f ; BIOS: wait for and read a key

BDOS: equ 0xf37d

;
; macro that creates a FCB structure
;
macro ____fcb_struct,data
drive_of_ ## data
ds 1,0
name_of_ ## data
ds 8,0
extension_of_ ## data
ds 3,0
current_block_of_ ## data
ds 2,0
record_size_of_ ## data
ds 2,0
size_of_ ## data
ds 4,0
date_of_ ## data
ds 2,0
time_of_ ## data
ds 2,0
reserved_of_ ## data
ds 8,0
current_record_of_ ## data
ds 1,0
record_of_ ## data
ds 4,0
endm

org 0x9000-7

db 0xfe
dw programStart
dw programStop
dw programStart
programStart:
jr loadFile
filename:
db "DOUBLEDR"
db "SC2"
filesize:
dw 14343
buffer:
dw fileContent
error:
db 0xff
file:
____fcb_struct file

loadFile:
;
; put the name of the file in FCB
;
ld bc,8+3
ld hl,filename ; copy name of the file and extension
ld de,name_of_file ; to 'file'
ldir
;
; sets the RAM address that will receive the block read from file
;
ld de,fileContent ; sets the RAM region to receive the data from the file
ld c,SetBuffer
call BDOS
;
; open the file
;
ld de,file ; points to 'file'
ld c,OpenFile ; open file
call BDOS
cp 0xff ; did it return 'File not found'?
jr z,errorCatch ; goes to error treatment
;
; Gets the size of the file
;
ld de,file ; points to 'file'
ld c,GetFileSize ; recover file size
call BDOS
cp 0xff ; anything wrong?
jr z,errorCatch ; goes to error treatment
;
; *** verify if the size of the file is correct ***
;
ld de,(size_of_file) ; low 16 bits of the real file size
ld hl,(filesize) ; expected size
or a ; clear carry before SBC
sbc hl,de
ld a,h ; HL = 0 only if the sizes match
or l
jr nz,errorCatch

; place the file pointer to the beginning of the file
;
ld hl,0
ld (record_of_file),hl ; random record number = 0
ld (record_of_file+2),hl

; Defines the block size to be read (record size).
;
ld hl,(filesize) ; recovers the size
ld (record_size_of_file),hl ; everything is read at once

; transfer content of file to RAM.
;
ld hl,1 ; reads 1 record of 14343 bytes
ld de,file
ld c,ReadRecords
call BDOS
cp 255 ; anything wrong?
jr z,errorCatch

; close file
;
ld de,file
ld c,UpdateFile ; close the file
call BDOS

xor a
ld (error),a ; set '0' as the error code

ld a,2 ; screen 2
call CHGMOD

di
ld a, 0x00 ; border colour: black
out (0x99), a
ld a, 0x87 ; write it to VDP register 7 (0x80 OR 7)
out (0x99), a
ei

ld hl,fileContent+7 ; skip the 7-byte BSAVE header in the file
ld de, 0 ; VRAM destination address
ld bc, 14343-7 ; image data only, without the header
call LDIRVM

call CHGET ; waits for input

ret

errorCatch:
ld a,255
ld (error),a
ret

fileContent: ; the file data is loaded here at run time
programStop:
db $

By Sandy Brand

Master (215)

30-11-2020, 17:39

Hmm, the code is confusing: the comment says it is going to read 13343 bytes in one go, but 'filesize' is actually defined as 14343?

Anyway, you should be able to modify the code by adding a loop so that it reads the entire file in chunks.

So: read data from the file into a temporary buffer, write the data from the temporary buffer into VRAM, and repeat until the entire file has been processed.

You can set the FCB record size to 1 byte so that you can explicitly tell BDOS how many bytes to load from the file (this value should then be min(size_of_temp_buffer, unread_bytes_remaining) for every pass through the loop), as in the sketch below.
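
A rough, untested sketch of what I mean, reusing the equates and FCB labels from your listing (CHUNK, bytesLeft, vramAddr, tempBuffer and the loop labels are names I just made up):

CHUNK: equ 1024 ; temporary buffer size, tune to taste

; point the disk transfer address at the temporary buffer
ld de,tempBuffer
ld c,SetBuffer
call BDOS
; record size = 1 byte, so the count given to ReadRecords is a byte count
ld hl,1
ld (record_size_of_file),hl
; start at the beginning of the file
ld hl,0
ld (record_of_file),hl
ld (record_of_file+2),hl
ld (vramAddr),hl ; VRAM destination starts at 0
ld hl,(size_of_file) ; low 16 bits only: assumes the file is < 64 kB
ld (bytesLeft),hl
readLoop:
ld hl,(bytesLeft)
ld a,h
or l
ret z ; nothing left to read: done
ld de,CHUNK ; this pass reads min(CHUNK,bytesLeft) bytes
or a
sbc hl,de
jr nc,sizeOk
ld de,(bytesLeft) ; last, partial chunk
sizeOk:
push de
ex de,hl ; HL = number of 1-byte records to read
ld de,file
ld c,ReadRecords ; BDOS advances the random record field itself
call BDOS
pop bc ; BC = bytes requested this pass
or a ; non-zero A = error (check this against your BDOS docs)
jr nz,errorCatch
ld hl,(bytesLeft) ; bytesLeft = bytesLeft - chunk size
or a
sbc hl,bc
ld (bytesLeft),hl
push bc ; copy the chunk to VRAM
ld hl,tempBuffer
ld de,(vramAddr)
call LDIRVM ; OK for SCREEN 2; SC5/SC8 need a V9938-aware copy
pop bc
ld hl,(vramAddr) ; advance the VRAM pointer
add hl,bc
ld (vramAddr),hl
jr readLoop

bytesLeft: dw 0
vramAddr: dw 0
tempBuffer: ds CHUNK,0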

By NYYRIKKI

Enlighted (5691)

30-11-2020, 19:03

iamweasel2 wrote:

which would be the best size choice in order not to make the code too much slower?

There is no absolute answer since each disk interface works in its own way. However, for big files I would suggest always loading blocks that are a multiple of 512 bytes. E.g. one cluster on a 720 kB floppy is 2 kB and one track is 9 kB, so something like 18 kB sounds like a pretty nice value. Starting a load always causes a bigger or smaller extra delay, so there is quite a good rule: "bigger is better". You can pretty easily see the speed effect in BASIC by playing a bit with CLEAR & BLOAD"FILE.SC8",S
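
In the loader that could just be an equate (hypothetical name; just make sure a buffer of that size still fits in RAM next to your code):

CHUNK: equ 18*1024 ; two 9 kB tracks of a 720 kB floppy, and a multiple of 512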

By iamweasel2

Paladin (670)

30-11-2020, 19:30

Sandy Brand wrote:

Hmm, the code is confusing: the comment says it is going to read 13343 bytes in one go, but 'filesize' is actually defined as 14343?

That was just a quick and dirty test to see that the code was working. The image used in the test had exactly 14343 bytes.
This will be fixed to use filesize in the next revision.

Sandy Brand wrote:

Anyway, you should be able to modify the code by adding a loop so that it reads the entire file in chunks.

So: read data from the file into a temporary buffer, write the data from the temporary buffer into VRAM, and repeat until the entire file has been processed.

You can set the FCB record size to 1 byte so that you can explicitly tell BDOS how many bytes to load from the file (this value should then be min(size_of_temp_buffer, unread_bytes_remaining) for every pass through the loop).

That's the info I am missing: if I set the record size to be read to 1000 bytes, how do I know how many bytes were actually read? (Say the file has only 534 bytes; how do I know that only 534 bytes were read?)

By iamweasel2

Paladin (670)

30-11-2020, 19:32

NYYRIKKI wrote:
iamweasel2 wrote:

which would be the best size choice in order not to make the code too much slower?

There is no absolute answer since each disk interface works in its own way. However, for big files I would suggest always loading blocks that are a multiple of 512 bytes. E.g. one cluster on a 720 kB floppy is 2 kB and one track is 9 kB, so something like 18 kB sounds like a pretty nice value. Starting a load always causes a bigger or smaller extra delay, so there is quite a good rule: "bigger is better". You can pretty easily see the speed effect in BASIC by playing a bit with CLEAR & BLOAD"FILE.SC8",S

Thanks, so it will be 18 kB. I plan to read images (SC2, SC5, SC8), so this seems to be a good size. :)

By NYYRIKKI

Enlighted (5691)

30-11-2020, 20:24

iamweasel2 wrote:

That's the info I am missing: if I set the record size to be read to 1000 bytes, how do I know how many bytes were actually read? (Say the file has only 534 bytes; how do I know that only 534 bytes were read?)

I suggest you swap the idea around in your head... Instead of loading 1 record of 1000 bytes, load 1000 records of 1 byte. This way, in the example you mentioned, 534 will be placed in HL. This whole idea of records comes from the CP/M operating system that MSX-DOS tries to be compatible with. You don't have such a thing as an exact file size on CP/M, so this is why you need to think this way instead. (Here is some more history)
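
In code it would look roughly like this (an untested sketch using the FCB labels from the first post):

ld hl,1 ; record size = 1 byte
ld (record_size_of_file),hl
ld hl,1000 ; ask for 1000 one-byte records
ld de,file
ld c,ReadRecords ; BDOS 0x27, random block read
call BDOS
; HL now holds the number of records actually read,
; e.g. 534 if only 534 bytes were left in the file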