LOAD DATA LOCAL INFILE 'abc.csv' INTO TABLE abc
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\r\n'
IGNORE 1 LINES
(col1, col2, col3, col4, col5...);
For MySQL 8.0 users:
Using the LOCAL keyword carries a security risk, so starting with MySQL 8.0 the LOCAL capability is disabled (set to False) by default. You may see this error:
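Typically the message looks like this:

ERROR 3948 (42000): Loading local data is disabled; this must be enabled on both the client and server sides

A common way around it (assuming your account is allowed to change global variables) is to enable local_infile on the server and then reconnect with a client that allows LOCAL:

-- on the server side
SET GLOBAL local_infile = 1;
-- on the client side, reconnect with LOCAL enabled, for example:
-- mysql --local-infile=1 -u root -p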
CREATE TABLE IF NOT EXISTS `survey` (
`projectId` bigint(20) NOT NULL,
`surveyId` bigint(20) NOT NULL,
`views` bigint(20) NOT NULL,
`dateTime` datetime NOT NULL
);
Your CSV file must be formatted correctly; for example, see the sample below:
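As an illustration (the values below are placeholders), a CSV matching the survey table above would have a header row followed by comma-separated rows, with text and datetime values optionally wrapped in double quotes:

projectId,surveyId,views,dateTime
1,101,2500,"2017-05-05 13:20:10"
1,102,310,"2017-05-06 09:12:45"

The header row is also why the query below uses IGNORE 1 LINES.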
If everything looks fine, execute the following query to load the data from the CSV file:
Note: use the absolute path of the CSV file.
LOAD DATA INFILE '/var/www/csv/data.csv'
INTO TABLE survey
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\r\n'
IGNORE 1 LINES;
If you run LOAD DATA LOCAL INFILE from a Windows shell and need to use OPTIONALLY ENCLOSED BY '"', then in order to escape the quote character correctly you have to do something like this:
"C:\Program Files\MySQL\MySQL Server 5.6\bin\mysql" -u root --password=%password% -e "LOAD DATA LOCAL INFILE '!file!' INTO TABLE !table! FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"^""' LINES TERMINATED BY '\n' IGNORE 1 LINES" --verbose --show-warnings > mysql_!fname!.out
LOAD DATA [LOW_PRIORITY | CONCURRENT] [LOCAL]
INFILE 'file_name' INTO TABLE `tbl_name`
[CHARACTER SET charset_name]
[{FIELDS | COLUMNS} [TERMINATED BY 'string']]
[LINES [TERMINATED BY 'string']]
[IGNORE number {LINES | ROWS}]
See this example:
LOAD DATA LOCAL INFILE
'E:\\wamp\\tmp\\customer.csv' INTO TABLE `customer`
CHARACTER SET 'utf8'
FIELDS TERMINATED BY ',' ENCLOSED BY '"'
LINES TERMINATED BY '\r\n'
IGNORE 1 LINES;
LOAD DATA INFILE "csv file location" INTO TABLE tablename
FIELDS TERMINATED BY ','
LINES TERMINATED BY '\n'
IGNORE 1 LINES
(column1,column2,.....);
Note: list the column names separated by commas, in the same order as in the CSV file.
To load a CSV file from any other remote system, you can also do the following after the steps above:
show global variables like 'local_infile';
-- Note: if the value is OFF then execute the next command below
set global local_infile=true;
-- then run the SHOW statement again to confirm the value is now ON
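Keep in mind that LOAD DATA LOCAL INFILE also requires LOCAL to be allowed on the client side; with the standard mysql command-line client you can reconnect with the --local-infile option (the user and database names below are placeholders):

mysql --local-infile=1 -u username -p database_name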
Then execute the following command to import the data from the CSV file on the other system:
LOAD DATA LOCAL INFILE "csv file location" INTO TABLE tablename
FIELDS TERMINATED BY ','
LINES TERMINATED BY '\n'
IGNORE 1 LINES
(column1,column2,.....);
-- column names separated by commas, in the same order as in the CSV file