I recently received a large database (>40 MB) in a txt format that I need to import into MySQL. Sadly, phpMyAdmin has a 25 MB maximum for file uploads, so I could not import it that way; instead, you have to use mysqlimport through the command line.
First, create or find a database to put your data in. Next, create a table whose name matches your filename (mysqlimport drops the extension to get the table name, so census.txt loads into a table named census) and whose columns match the structure of the CSV/TXT file.
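Before writing out the table definition, it can help to peek at the file to confirm the delimiter and count the columns. Here is a quick sketch using a throwaway sample file (census.txt and the '|' delimiter are just illustrative):

```shell
# Create a tiny throwaway sample; in practice, inspect your real file.
printf '1|Alice|1200\n2|Bob|3400\n' > census.txt

# Look at the first row to confirm the delimiter:
head -n 1 census.txt

# Count the columns, assuming '|' separates them:
head -n 1 census.txt | awk -F'|' '{print NF " columns"}'
# prints: 3 columns
```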
Next, go to the command-line prompt of the server hosting your database and cd into the folder containing your CSV/TXT files. Then run this command:
mysqlimport -uroot -psecret --local --fields-terminated-by 'delimiter' dbname filename
- replace root and secret with your username and password
- replace delimiter with whatever is separating the columns in your file (usually commas; in my case it was '|')
- replace dbname and filename with your database name and your filename, including the extension (e.g., database.csv or census.txt)
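Putting the pieces together with hypothetical values (user root, password secret, pipe delimiter, database census_db, file census.txt), the full command would look like this. It is assembled in variables here so you can swap in your own values:

```shell
# All of these values are placeholders; substitute your own.
DB_USER=root
DB_PASS=secret
DELIM='|'
DB_NAME=census_db
DATA_FILE=census.txt

CMD="mysqlimport -u${DB_USER} -p${DB_PASS} --local --fields-terminated-by='${DELIM}' ${DB_NAME} ${DATA_FILE}"
echo "$CMD"
# prints: mysqlimport -uroot -psecret --local --fields-terminated-by='|' census_db census.txt
```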
If you are using MAMP on OS X, use /Applications/MAMP/library/bin/mysqlimport rather than mysqlimport.
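A small guard makes the same script work on and off MAMP: it uses MAMP's bundled binary when it exists and falls back to whatever mysqlimport is on your PATH (the MAMP path is the one from the note above):

```shell
# Prefer MAMP's bundled mysqlimport if it exists; otherwise use PATH.
MYSQLIMPORT=/Applications/MAMP/library/bin/mysqlimport
if [ ! -x "$MYSQLIMPORT" ]; then
  MYSQLIMPORT=mysqlimport
fi
echo "using: $MYSQLIMPORT"
```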
If all went well, you should see something like this:
dbname.tablename: Records: xxxx Deleted: x Skipped: x Warnings: xxxx
If you get warnings, don't worry about them until you've looked at the table itself. I got a warning on every single row, but it turned out to be just fine. If you are really curious, add the --verbose parameter after --local. You can see the full list of options by running the command with no parameters, or read the man page with man mysqlimport.
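If you want to script a quick sanity check on that summary line, the warning count is simply the last whitespace-separated field. Here is a sketch against a made-up sample line (the numbers are invented for illustration):

```shell
# Sample summary line as printed by mysqlimport (values are made up):
SUMMARY="census_db.census: Records: 4125 Deleted: 0 Skipped: 0 Warnings: 4125"

# The warning count is the last whitespace-separated field:
WARNINGS=$(echo "$SUMMARY" | awk '{print $NF}')
echo "warnings: $WARNINGS"
# prints: warnings: 4125
```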
Good luck with your database, and let me know in the comments what worked well and what didn't!