
Posts

India AI Impact Summit 2026

The India AI Impact Summit 2026, held from February 16–20 at Bharat Mandapam in New Delhi, has emerged as a landmark event for the Global South. As of today, February 19, 2026, the summit is nearing its conclusion with several historic outcomes focused on democratizing technology and sovereign AI development. The summit, themed "People, Planet, and Progress," has moved beyond theoretical debate into concrete financial and structural commitments. 1. Major Financial & Infrastructure Commitments: The most tangible outcome is the massive scale of investment and infrastructure pledged to make India a global AI powerhouse. ₹20,000 Crore Investment: global and domestic investors finalized commitments exceeding ₹20,000 crore to be deployed over the next two years into India's AI ecosystem. GPU Democratization: under the IndiaAI Mission, the government announced the onboarding of over 38,000 GPUs. Crucially, these are being made available...
Recent posts

How to set up automated export on Google Cloud SQL

We can do backup and restore at both the instance and database level. On-demand and automatic backups are possible only at the instance level, and we can configure them in the Google Cloud Console. At the database level, Cloud SQL has no built-in option for full and differential backups like SQL Server. We can accomplish this with the Export feature, which is similar to a full backup and can be run from the console. If we want an automated database export (full backup), we have to design it ourselves with the help of Google services such as Cloud Functions and Cloud Scheduler, arranged in an architecture like the following. Here are the steps to set it up: create a bucket in Google Cloud Storage; create a Cloud Function to export a Cloud SQL database; grant the Cloud Function permission to call the Cloud SQL export API; test the Cloud Function; and create a Cloud Scheduler job to trigger the Cloud Function once a week (a sketch of such a function follows below). For detail or step...
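The export step itself can be a small Cloud Function that calls the Cloud SQL Admin API. Below is a minimal sketch in Python, assuming an HTTP-triggered function whose service account has been granted Cloud SQL export permission; the project, instance, database, and bucket names are placeholders, not values from the post.

```python
# Minimal sketch: HTTP-triggered Cloud Function that starts a Cloud SQL export.
# Requires google-api-python-client in requirements.txt; all names below are
# assumed placeholders, not values from the original post.
import datetime

from googleapiclient import discovery

PROJECT = "my-project"          # assumed GCP project ID
INSTANCE = "my-sql-instance"    # assumed Cloud SQL instance name
DATABASE = "my-database"        # assumed database to export
BUCKET = "my-backup-bucket"     # assumed Cloud Storage bucket


def export_database(request):
    """Starts a SQL export of one database to a GCS bucket."""
    # Uses the function's default service account credentials.
    sqladmin = discovery.build("sqladmin", "v1beta4")
    timestamp = datetime.datetime.utcnow().strftime("%Y%m%d-%H%M%S")
    body = {
        "exportContext": {
            "fileType": "SQL",
            "uri": f"gs://{BUCKET}/{INSTANCE}-{timestamp}.sql",
            "databases": [DATABASE],
        }
    }
    # instances().export() kicks off a long-running export operation.
    operation = (
        sqladmin.instances()
        .export(project=PROJECT, instance=INSTANCE, body=body)
        .execute()
    )
    return f"Export started: {operation['name']}"
```

Cloud Scheduler can then call this function's HTTP endpoint on a weekly cron schedule, which corresponds to the last step in the list above.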

Why new MongoDB performance improves 7x

The storage engine is the core component of any database; it is responsible for moving data between disk and memory. MongoDB supports multiple storage engines, and choosing the appropriate one for your use case can significantly impact the performance of your applications: WiredTiger Storage Engine, In-Memory Storage Engine, and MMAPv1 Storage Engine. MongoDB removed its initial storage engine, MMAPv1, and WiredTiger is now the default, so we should understand the reason behind this change. MMAPv1 storage engine: concurrency control is at the collection level, so write performance is only moderate, and this engine does not support compression. WiredTiger storage engine: concurrency control is maintained at the document level. WiredTiger uses MultiVersion Concurrency Control (MVCC) like other RDBMS, so write performance is excellent, and this engine does support compression. With WiredT...
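As a quick check of which engine a deployment is actually running, the serverStatus command reports the active storage engine. Here is a minimal sketch with the pymongo driver, assuming a locally running mongod; the connection string is illustrative only.

```python
# Minimal sketch: report the active MongoDB storage engine.
# Assumes pymongo is installed and a mongod is reachable at the URI below.
from pymongo import MongoClient

client = MongoClient("mongodb://localhost:27017")

# serverStatus includes a "storageEngine" section with the engine name.
status = client.admin.command("serverStatus")
engine = status["storageEngine"]["name"]
print(f"Active storage engine: {engine}")  # "wiredTiger" on modern versions
# Older deployments still on the removed engine would report "mmapv1" here.
```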

ETL Features of SQL Data Platform

Whether for a data warehouse or big data, ETL or ELT is the major part; you could say it plays a vital role in the data transformation life cycle. ETL/ELT is used in all kinds of data-related operations: data migration, data transformation, business intelligence, data warehousing, big data, and analytics. ETL is Extract, Transform and Load, whereas ELT is Extract, Load, then Transform. Both have their own pros and cons, but nowadays ELT is more popular than ETL because the volume of data to be transformed keeps growing. ELT also supports the data virtualization concept, where the original data stays on the source system without modification and less transformation is needed. SQL Server has a rich set of tools for ETL and ELT. Here I am sharing some important differences between ETL and ELT which we should know before jumping into the features (a small sketch contrasting the two flows follows below). SrlNo 1 — Dimension: Technology Adoption — ETL: a well-developed process used for over 20 years — ELT: a newer technology ...
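To make the difference concrete, here is a minimal Python sketch contrasting the two flows, assuming a psycopg2-style DB-API connection and a hypothetical source CSV; the connection object, table names, and columns are placeholders, not code from the post.

```python
# Minimal sketch: ETL vs. ELT with hypothetical tables sales_raw / sales_clean.
# "conn" is assumed to be a psycopg2-style connection; the transform SQL uses
# PostgreSQL-flavored syntax purely as an illustration.
import csv


def etl(conn, source_csv):
    """ETL: transform rows in the application *before* loading them."""
    with open(source_csv, newline="") as f:
        cleaned = [
            (row["id"], round(float(row["amount"]), 2))  # transform first
            for row in csv.DictReader(f)
        ]
    with conn, conn.cursor() as cur:
        cur.executemany(
            "INSERT INTO sales_clean (id, amount) VALUES (%s, %s)", cleaned
        )


def elt(conn, source_csv):
    """ELT: load raw data as-is, then transform inside the database with SQL."""
    with open(source_csv, newline="") as f:
        rows = [(r["id"], r["amount"]) for r in csv.DictReader(f)]
    with conn, conn.cursor() as cur:
        cur.executemany(
            "INSERT INTO sales_raw (id, amount) VALUES (%s, %s)", rows
        )
        # The transformation runs in the database engine, where it can scale
        # with the data instead of with the application host.
        cur.execute(
            "INSERT INTO sales_clean (id, amount) "
            "SELECT id, ROUND(amount::numeric, 2) FROM sales_raw"
        )
```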

Postgres new JSON data type: support for static and dynamic schemas

Now we can create relational tables with both static and dynamic schemas, so a new kind of heterogeneous data structure can be built in relational tables. PostgreSQL is a relational database that is much more concerned with standards compliance and extensibility than with giving you freedom over how you store data. It supports both static and dynamic schemas and lets you use it for relational data and normalized-form storage. Static relational data structure in Postgres: CREATE TABLE project ( id serial PRIMARY KEY, name varchar, mgr integer, is_active boolean ); CREATE TABLE task ( id serial PRIMARY KEY, name varchar, status boolean, project_id integer, CONSTRAINT idx_project_id FOREIGN KEY (project_id) REFERENCES project (id) ); Dynamic relational data structure in Postgres: CREATE TABLE IF NOT EXISTS project ( id serial PRIMARY KEY, name varchar, mgr integer, tasks jsonb, is_active boolean ); Keeping JSON documents inside a Postgres table ...
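Here is a minimal sketch of storing and querying such a JSONB document from Python, assuming the dynamic project table above, a local PostgreSQL instance, and the psycopg2 driver; the connection parameters and sample data are placeholders.

```python
# Minimal sketch: insert a JSONB "tasks" document and query inside it.
# Assumes the dynamic "project" table from the excerpt and a local Postgres.
import psycopg2
from psycopg2.extras import Json

conn = psycopg2.connect("dbname=demo user=postgres host=localhost")

tasks = [
    {"name": "design schema", "status": True},
    {"name": "load sample data", "status": False},
]

with conn, conn.cursor() as cur:
    # The dynamic part of the row is stored as a JSONB document.
    cur.execute(
        "INSERT INTO project (name, mgr, tasks, is_active) "
        "VALUES (%s, %s, %s, %s)",
        ("Data platform", 1, Json(tasks), True),
    )
    # jsonb_array_elements expands the tasks array into rows, and ->>
    # extracts a field of each element as text.
    cur.execute(
        "SELECT p.name, t.elem->>'name' AS task_name "
        "FROM project p, jsonb_array_elements(p.tasks) AS t(elem) "
        "WHERE (t.elem->>'status')::boolean = false"
    )
    print(cur.fetchall())
```

This is what makes the table "dynamic": the relational columns stay normalized while the jsonb column can hold a different document shape per row.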

Changes in RDBMS after JSON

Hi friends, I am continuing my series "Is it time to revisit PostgreSQL and MySQL with JSON support". Today I am sharing my 4th post, which is about "Changes in RDBMS after JSON". JSON brought changes to various vendors and technologies, so introducing this feature created new kinds of challenges for them. To cover these challenges, RDBMS engines started including a JSON data type. Sooner or later, all major RDBMS vendors (e.g. PostgreSQL, MySQL and SQL Server) added JSON as a data type for keeping documents in relational tables. These are the database versions when JSON was introduced: PostgreSQL 9.3, MySQL 5.7.8, SQL Server 2016. Now a new heterogeneous kind of database structure can be created with a relational engine, and static and dynamic data structures in RDBMS have been introduced. In other words, Relational Table = Normalization + denormalized (JSON/Array). JSON data passing is done via stan...

3rd day of lockdown in India

It’s the 3rd day of lockdown in India. Today I started the morning with some tips that will help with working from home during the lockdown. Here I am going to share the top 10 tips for working from home during the coronavirus pandemic outbreak. Avoid working while lying in bed or relaxing on a couch/sofa/reclining chair; these comfortable setups make it difficult to focus on work. Ideally, a study table and chair are conducive to staying focused and maintaining productivity. Maintain a routine: get up at a reasonable time, shower, and have breakfast/lunch at the usual times. Also ensure you log off at a reasonable time and get sufficient sleep at night. Try to emphasize to your family that work-from-home time is not a holiday and request them to keep interruptions to a minimum. Take a 2-3-minute break from work every 30-45 minutes to stretch, walk around or do some simple exercises. Set up and maintain regular meetings with your managers and teams. Keep people updated abou...