Hi, I am using TEMP_TABLE to prepare data and then load it, and it works great. After the data is loaded, should I drop the TEMP_TABLE to free up memory? If so, is there any special syntax for dropping it, and at what point in the job should I do it?
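For context, my loading job follows the usual two-stage TEMP_TABLE pattern, roughly like the sketch below (the job, graph, table, vertex, and attribute names here are placeholders, not my actual schema; the file path is the one from my log):

```
CREATE LOADING JOB load_people FOR GRAPH MyGraph {
  DEFINE FILENAME f = "/home/tigergraph/mydata/mydata.csv";

  -- Stage 1: stage rows into a temporary table,
  -- splitting a delimited column with flatten()
  LOAD f
    TO TEMP_TABLE t1 (id, name) VALUES ($0, flatten($1, ",", 1))
    USING separator=",", header="false";

  -- Stage 2: load from the temp table into the graph
  LOAD TEMP_TABLE t1
    TO VERTEX Person VALUES ($"id", $"name");
}
```

I then run it with something like RUN LOADING JOB load_people USING f="/home/tigergraph/mydata/mydata.csv".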
Also, I noticed that after using TEMP_TABLE, the loading sometimes gets stuck, and I have to run gadmin restart all -y rather frequently. When it is stuck, I see the following in the log:
[INFO] Start loading /home/tigergraph/mydata/mydata.csv, LineBatch = 8192, LineOffset = 1, ByteOffset = 140
0223 17:02:59.458 I event/command_consumer.go:56] Received cmd. Name: ReportKafkaStatus, Id: ab96bc427c284357a5894348766d074e, CreatedTimeStamp: 1614099778798289634, DeadlineTimeStamp: 1614099793798289634
0223 17:02:59.570 D event/command_consumer.go:58] Cmd Id: ab96bc427c284357a5894348766d074e, Parameter: type_url:"type.googleapis.com/tigergraph.tutopia.common.pb.Bytes" value:"\n\014\010\001\020\001(:0\302\352\324\201\006"
E0223 17:03:23.047664 36894 heartbeat_client.cpp:211] Issue client session write failed. rc:kNoResource round:39
E0223 17:03:26.408869 36894 heartbeat_client.cpp:211] Issue client session write failed. rc:kNoResource round:42
E0223 17:03:30.693869 36912 gdict.cpp:584] RPC timeout reached, no retry, timeout: 5000
E0223 17:03:30.698330 36912 gbrain_service_manager.cpp:366] Can not list service root: /__services, rc: kTimeout
0223 17:03:30.969 I event/command_consumer.go:56] Received cmd. Name: ReportKafkaStatus, Id: 54a2a517bc804fe89b2832125afedf60, CreatedTimeStamp: 1614099810719235860, DeadlineTimeStamp: 1614099825719235860
Thanks,
John