Current archive: 2002.01.08;
Quick access to the ORACLE DB
Yelchev (2001-12-03 10:26) 
There is a problem. I have a database with on the order of a million records with BLOB fields. I use the Oracle Direct Access components. Reading takes a lot of time. How can table reads be made faster, and are there components with faster database access? To give an example: selecting 25,000 records of about 20K each takes 15 minutes.
Владислав (2001-12-03 11:22) 
Please post the text of the query.
petr_v_a (2001-12-03 11:24) 
Faster components are unlikely to exist; even if something is faster, it is not faster by multiples. Counter-question: why drag 25,000 records onto the client? A user cannot take in more than 50-100 records in a grid anyway and will want "search within results" and similar niceties. On the database side, read up on the LOB storage settings - there are a good 100 pages on them.
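For reference, LOB storage is set per column at table creation time. A minimal sketch (table and column names are hypothetical, parameter values illustrative only):

```sql
CREATE TABLE images (
  id   NUMBER PRIMARY KEY,
  data BLOB
)
LOB (data) STORE AS (
  DISABLE STORAGE IN ROW  -- ~20K values will not fit in-row anyway
  CACHE                   -- read LOB blocks through the buffer cache
  CHUNK 16384             -- larger chunks = fewer round trips per LOB
);
```

Whether CACHE helps or hurts depends on how much of the buffer cache the LOB reads displace, so measure both ways.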
Yelchev (2001-12-03 12:22) 
The user is not involved at all; the records are read one after another as arrays. The arrays then go through mathematical calculations, and at the end of the comparison a short list is displayed. No grids are used. Can someone suggest, in general terms, how to approach fast work with a database like this? With a million records the reading alone takes half a day! I will be very grateful to everyone who replies ))
Yuvich (2001-12-03 12:44) 
Ah, so it's not the query itself that takes ~15 min., but the processing of the BLOBs. Here you need to look at what is stored in the BLOB: structured information or not. If structured, it should be decomposed into tables, with the processing done not at query time but at the time of writing to the table. If not structured, still try to represent the information as a structure. As one mathematician said: "there is no subject area that could not be represented as a hierarchical structure."
Yelchev (2001-12-03 12:50) 
No, not at all. My experiments were on plain data reads, without any processing. And the data cannot be structured: these are arrays describing a processed image, and they cannot be split up! In general, where can I read about solving problems like this, involving extra-large databases?
petr_v_a (2001-12-03 13:33) 
This is worse. Read carefully about the LOB storage options :). Look at V$SESSION_WAIT and V$SYSTEM_EVENT to see where the time is really being spent. If calculations are done over BLOB volumes like these, maybe you should think about external procedures. In general, in my opinion, Oracle is not very good at pumping large volumes out to a client.
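For example, a rough way to see where the time goes, as a sketch (requires SELECT privilege on the V$ views):

```sql
-- Top wait events instance-wide since startup
SELECT event, total_waits, time_waited
  FROM v$system_event
 ORDER BY time_waited DESC;

-- What one session is waiting on right now (:sid is your session's SID)
SELECT event, wait_time, seconds_in_wait
  FROM v$session_wait
 WHERE sid = :sid;
```

If network-related events (e.g. "SQL*Net more data to client") dominate, the bottleneck is the transfer to the client, not the disk reads.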
Yelchev (2001-12-03 13:55) 
And how much time would I gain by embedding the comparison in the server (if that is even possible) rather than downloading the data to the client, given that comparing the arrays stored in the BLOBs requires a Fourier transform and other arithmetic, plus allocation of a large amount of memory (up to 29 MB)?
Mick (2001-12-03 14:05) 
If Oracle is on Wintel, I would move the BLOB processing to the server. That is, the usual three-tier scheme.
Yelchev (2001-12-03 14:11) 
"The usual three-tier scheme"? Sorry for my lack of understanding, but what does that mean?
petr_v_a (2001-12-03 14:29) 
The easiest way to measure the potential time gain is to write a server-side loop over the query and see how long it takes:

    for cr in (<your query>) loop
      null;
    end loop;
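A slightly fuller sketch of the same measurement, timed with DBMS_UTILITY.GET_TIME (the table name "images" is hypothetical):

```sql
SET SERVEROUTPUT ON
DECLARE
  t0 PLS_INTEGER;
BEGIN
  t0 := DBMS_UTILITY.GET_TIME;            -- hundredths of a second
  FOR cr IN (SELECT data FROM images) LOOP
    NULL;                                 -- fetch only, no client transfer
  END LOOP;
  DBMS_OUTPUT.PUT_LINE('Elapsed: ' ||
    (DBMS_UTILITY.GET_TIME - t0) / 100 || ' s');
END;
/
```

If this runs in seconds while the client-side fetch takes 15 minutes, the time is going into transferring the BLOBs over the network, not into reading them.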
Yuvich (2001-12-03 15:48) 
Mick is right - you need to move the processing to the server; then the processing speed depends on the power of the server, not of the client.
Even if Oracle is not on Wintel, you can write a PL/SQL procedure that calls another procedure declared as external and written in, say, C or Cobol. The catch is that the language must have a compiler on the OS Oracle runs on, with a calling convention Oracle supports. For the details - read the documentation.
petr_v_a (2001-12-03 16:29) 
An "external" procedure can be written in Delphi or even in assembler; what matters is the C calling conventions. As for Wintel - the documentation has a gorgeous phrase (my translation): "external procedures are supported on any platform that supports DLLs, for example Solaris" :))
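The registration side looks roughly like this. A sketch only - the library path, object names, and the simplified RAW parameters are all hypothetical (passing real BLOBs goes through LOB locators):

```sql
CREATE OR REPLACE LIBRARY fft_lib AS '/opt/app/lib/libfft.so';
/
CREATE OR REPLACE FUNCTION compare_arrays (a IN RAW, b IN RAW)
  RETURN BINARY_INTEGER
AS LANGUAGE C
  LIBRARY fft_lib
  NAME "compare_arrays";
/
```

The shared library itself can be built with any compiler that produces a loadable module with C calling conventions on the server's platform.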
Yuvich (2001-12-03 17:12) 
I don't think a DLL written in Delphi can be used on Solaris; that's why I say you should write - even in assembler - in a language whose compiler exists on that OS.
Concerning the "gorgeous phrase", a small addition: ...supporting DLLs, or dynamically loaded shared libraries... for example, Solaris .so libraries.
Yuvich (2001-12-03 17:15) 
A phrase from the same documentation:
"So, some tasks are more quickly or easily done in a lower-level language such as C, which is more efficient at machine-precision calculations. For example, a Fast Fourier Transform (FFT) routine written in C runs faster than one written in PL/SQL."
petr_v_a (2001-12-03 17:53) 
> Yuvich :) Well, no, I wasn't proposing to use a DLL written in Delphi on Solaris :) The point was that you can write it in anything, as long as the calling convention is C's. And it has to actually run, of course :)
Yuvich (2001-12-03 18:02) 
Nothing personal. Perhaps I misunderstood something.
ASV (2001-12-04 03:11) 
The arithmetic here is very simple. Your network is probably 10 Mbit?
So 25000 * 20K = 500,000K, which at an average network throughput of 600K/sec gives about 13.9 minutes.
And there is nothing for it except to move the calculation to the server.