Recently some developers challenged me, claiming that the PostgreSQL server performs poorly: in their opinion it could not load 500,000 (5 lakh) records from a CSV file. They showed me their application, built on Node.js with the Sequelize ORM; when they ran it, it really did get stuck midway.
500,000 records is a large amount, but PostgreSQL is also used for high-performance applications, so there was no question of it being unable to handle this much data. I suggested he start by importing smaller batches: the top 6 records, then the top 600, then 6,000, then 60,000. He did, and the import really did stall after 60,000 records. It failed from the application side; the program just kept running and running.
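The incremental probing above can be sketched roughly like this. Note that `runImport` is a hypothetical stand-in for the application's Sequelize-based insert path, not code from the actual application:

```javascript
// Probe progressively larger imports to find the first size that fails.
// runImport is a hypothetical async function standing in for the app's
// insert path; the sizes mirror the ones we actually tried.
async function findFailurePoint(runImport, sizes = [6, 600, 6000, 60000]) {
  for (const n of sizes) {
    try {
      await runImport(n);   // import the top n records
      console.log(`ok at ${n} rows`);
    } catch (err) {
      return n;             // first size where the import fails
    }
  }
  return null;              // every size succeeded
}
```

In our case the probe would have reported the failure at 60,000, which at least told us the problem was size-dependent rather than a plain bug.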
So the ball landed in my court. I started digging into the database server configuration parameters such as effective_cache_size, work_mem, shared_buffers, max_connections and wal_buffers, set appropriate values per best practices and the system's current resources, then restarted the server to try again.
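For reference, here is a postgresql.conf fragment of the kind of tuning involved. The numbers below are illustrative starting points for a dedicated server with roughly 8 GB of RAM, not the exact values from this incident:

```ini
# Illustrative values for a dedicated ~8 GB server; tune for your hardware.
shared_buffers = 2GB            # ~25% of RAM is a common starting point
effective_cache_size = 6GB      # ~50-75% of RAM; a planner hint, not an allocation
work_mem = 32MB                 # per sort/hash operation, per connection
wal_buffers = 16MB              # -1 (auto-sized) is usually fine on modern versions
max_connections = 100
```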
We repeated the same steps, inserting the top 6, then 600, then 6,000, then 60,000 records. Once again it stopped after 60,000 records and failed from the application side.
Next I tried the insert from the database console. I ran the statements below and, believe it or not, they finished in less than 2 seconds with all the data inserted into the table.
-- Staging table: load everything as text first; type conversion can come later.
CREATE TABLE fare
(
    CompanyCode        character varying(255),
    LineNumber         character varying(255),
    CardTypeCode       character varying(255),
    FirstLocationCode  character varying(255),
    SecondLocationCode character varying(255),
    FareAmount         character varying(255)
);

-- Bulk-load the tab-separated file straight into the staging table.
COPY fare(CompanyCode, LineNumber, CardTypeCode, FirstLocationCode, SecondLocationCode, FareAmount)
FROM E'C:\\master_part\\cmn0006_0103.tsv' WITH (format csv, delimiter E'\t');

-- Remove the header row, which was loaded as data.
DELETE FROM fare WHERE CompanyCode = 'Company Code';

-- Move the rows into the real table.
INSERT INTO "fareMaster"("companyCode","lineNumber","cardTypeCode","firstLocationCode","secondLocationCode","fareAmount")
SELECT companycode, linenumber, cardtypecode, firstlocationcode, secondlocationcode, fareamount
FROM fare;
So the ball was out of my court. When he started digging into the application, he made some minor changes on the ORM side, and the application started loading lakhs of records into the database. Now he believes PostgreSQL can handle huge bulk inserts.
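I did not see his exact fix, but the classic ORM-side change is to stop issuing one INSERT per row and batch the records instead. A minimal sketch of that idea, where the batch size of 1000 is just an assumption to tune against row width and memory:

```javascript
// Split an array of records into fixed-size batches so the ORM can
// issue one multi-row INSERT per batch instead of one INSERT per row.
function chunk(records, batchSize) {
  const batches = [];
  for (let i = 0; i < records.length; i += batchSize) {
    batches.push(records.slice(i, i + batchSize));
  }
  return batches;
}

// With Sequelize this would typically be combined with bulkCreate,
// e.g. (assuming a Fare model mapped to the table above):
//
//   for (const batch of chunk(rows, 1000)) {
//     await Fare.bulkCreate(batch);   // one multi-row INSERT per batch
//   }
```

Even batched INSERTs will not match the raw speed of COPY, but they are usually more than enough to make a few lakh rows a non-event.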
Please don't forget to like the Facebook page:
https://www.facebook.com/pages/Sql-DBAcoin/523110684456757