
Writing Python Dataframe To Mssql Table

I currently have a Python dataframe that is 23 columns and 20,000 rows. Using Python code, I want to write my dataframe into an MSSQL server that I have the credentials for.

Solution 1:

When you say Python dataframe, I'm assuming you're using a Pandas dataframe. If that's the case, you can use its to_sql method. Note that the connection should be a SQLAlchemy engine or connection (pandas also accepts a sqlite3 connection, but not a raw pyodbc one).

df.to_sql("MODREPORT", connection, if_exists="replace")

With the if_exists argument set to "replace", pandas drops the existing table, recreates it, and then inserts the records.
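A minimal sketch of the approach above, wrapped in a helper so the engine can be created separately. The server name, database, and driver string in the commented-out connection URL are placeholders, not real credentials:

```python
import pandas as pd
from sqlalchemy import create_engine

def write_dataframe(df, engine, table="MODREPORT"):
    # if_exists="replace" drops and recreates the table, then inserts all rows;
    # index=False keeps the dataframe index out of the SQL table
    df.to_sql(table, engine, if_exists="replace", index=False)

# For MSSQL via pyodbc (hypothetical server, database, and credentials):
# engine = create_engine(
#     "mssql+pyodbc://user:password@myserver/mydb"
#     "?driver=ODBC+Driver+17+for+SQL+Server"
# )
# write_dataframe(my_df, engine)
```

For a 20,000-row dataframe you may also want to pass chunksize (e.g. chunksize=1000) to to_sql so the insert is batched rather than sent as one statement.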

Solution 2:

I realise it's been a while since you asked, but the easiest way to delete ALL the rows in the SQL Server table (point 1 of the question) is to send the command

TRUNCATE TABLE Tablename

This will drop all the data in the table but leave the table and indexes empty so you or the DBA would not need to recreate it. It also uses less of the transaction log when it runs.
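This approach can be sketched as below: truncate first, then append into the now-empty table so its schema and indexes are preserved. The clear_stmt parameter is an assumption added here so other backends (which may not support TRUNCATE) can substitute their own statement; the table name MODREPORT is carried over from Solution 1:

```python
import pandas as pd
from sqlalchemy import create_engine, text

def empty_and_append(df, engine, table="MODREPORT", clear_stmt=None):
    # TRUNCATE TABLE is SQL Server syntax; some engines (e.g. SQLite)
    # need "DELETE FROM <table>" instead, passed via clear_stmt.
    # Only use trusted table names here -- this interpolates into raw SQL.
    clear_stmt = clear_stmt or f"TRUNCATE TABLE {table}"
    with engine.begin() as conn:
        conn.execute(text(clear_stmt))
    # Append into the emptied table without dropping or recreating it
    df.to_sql(table, engine, if_exists="append", index=False)
```

Unlike if_exists="replace", this keeps the table definition, permissions, and indexes exactly as the DBA created them.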
