How to remove a batch of vertices or edges using REST API?

There is a REST API endpoint for removing a single vertex or edge; is there a way to do it in batch?

curl -X DELETE "http://server_ip:9000/graph/{graph_name}/vertices/{vertex_type}[/{vertex_id}]"
curl -X DELETE "http://server_ip:9000/graph/{graph_name}/edges/{source_vertex_type}/{source_vertex_id}[/{edge_type}[/{target_vertex_type}[/{target_vertex_id}]]]"

Actually, you can remove multiple vertices and edges in one round with those endpoints.

  • The vertex ID in DELETE /vertices is optional. If you do not specify an ID, you can use an additional parameter called filter, which lets you specify a set of simple conditions over the vertex attributes, so you can delete all vertices meeting those conditions in one call.
    NOTE: the primary ID is normally not an attribute, so you can’t refer to it in the filter. If you need to, add WITH primary_id_as_attribute="true" to the vertex definition.

  • Deleting multiple edges is more restricted: you must specify a source vertex ID, so you can’t simply delete all edges having a specific attribute value; you would need to iterate through all relevant source vertices and delete the corresponding edges from each. But if you want to delete several edges originating from the same vertex, you can use the filter parameter here too.
    The workaround for this limitation is to write a query that deletes the edges you don’t need. GSQL is quite flexible and versatile; you can write a query that accepts some value(s) as parameters and deletes edges accordingly.
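To make the two delete endpoints above concrete, here is a small sketch that builds the DELETE URLs, including the optional filter parameter. The host, graph, type, and attribute names are illustrative assumptions; the filter expression syntax follows the simple attribute-condition form described above.

```python
# Sketch: building batch-delete URLs for the REST++ DELETE endpoints.
# "server_ip", "MyGraph", "Person", and "age>60" are example values, not
# anything from a real deployment.
from urllib.parse import urlencode

def delete_vertices_url(host, graph, vertex_type, filter_expr=None):
    """URL for DELETE /graph/{graph}/vertices/{type}. With no vertex ID,
    an optional filter expression selects the vertices to remove."""
    url = f"http://{host}:9000/graph/{graph}/vertices/{vertex_type}"
    if filter_expr:
        url += "?" + urlencode({"filter": filter_expr})
    return url

def delete_edges_url(host, graph, src_type, src_id,
                     edge_type=None, filter_expr=None):
    """URL for DELETE /graph/{graph}/edges/... — note that the source
    vertex ID is mandatory, which is the restriction discussed above."""
    url = f"http://{host}:9000/graph/{graph}/edges/{src_type}/{src_id}"
    if edge_type:
        url += f"/{edge_type}"
    if filter_expr:
        url += "?" + urlencode({"filter": filter_expr})
    return url

# Delete every Person vertex whose age attribute exceeds 60:
print(delete_vertices_url("server_ip", "MyGraph", "Person", "age>60"))
# An actual call would then be, e.g., requests.delete(url) in Python,
# or the same URL with curl -X DELETE.
```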

In general, you are free to extend the REST API with whatever functionality you need (as long as it’s data manipulation). Since installed queries come with their own REST endpoints, you can use these queries just like the official (i.e. built-in) endpoints.
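For illustration, an installed query is exposed as a GET /query/{graph}/{queryName} endpoint on the REST++ port, with query parameters passed in the query string. The query name and parameter below are hypothetical:

```python
# Sketch: calling a (hypothetical) installed query through its generated
# REST endpoint. "deleteEdgesByAttr" and "minWeight" are made-up names.
from urllib.parse import urlencode

def installed_query_url(host, graph, query_name, params):
    # Installed queries are served as GET /query/{graph}/{queryName}
    return f"http://{host}:9000/query/{graph}/{query_name}?" + urlencode(params)

print(installed_query_url("server_ip", "MyGraph", "deleteEdgesByAttr",
                          {"minWeight": 5}))
```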

What’s more, you do not even need to install queries: through POST /gsqlserver/interpreted_query you can run queries immediately, skipping the lengthy installation step. There are some limitations on what an interpreted query can do (vs. an installed one), and interpreted queries are somewhat slower, but in your app (the one using the REST API) you can even generate the query body on the fly!
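As a sketch of generating a query body on the fly: the GSQL text is sent as the POST payload of /gsqlserver/interpreted_query (served by the GSQL server, typically on port 14240, with user authentication). The graph, edge type, and attribute names below are assumptions, and edge deletion may be among the features restricted in interpreted mode, so treat this as illustrative only:

```python
# Sketch: generating an interpreted-query body in application code.
# "MyGraph", "Knows", and "weight" are hypothetical names.
def interpreted_query_request(graph, edge_type, attr, value):
    # The GSQL text itself is the POST payload; note the INTERPRET header
    # instead of CREATE QUERY.
    body = f"""INTERPRET QUERY () FOR GRAPH {graph} {{
  start = {{ANY}};
  res = SELECT s FROM start:s -({edge_type}:e)-> :t
        WHERE e.{attr} == {value}
        ACCUM DELETE(e);
}}"""
    # A real call would be something like:
    # requests.post("http://server_ip:14240/gsqlserver/interpreted_query",
    #               data=body, auth=(user, password))
    return body

print(interpreted_query_request("MyGraph", "Knows", "weight", 0))
```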

I have created a Postman collection for the TigerGraph REST++ API, covering all endpoints with comments and docs; you might want to have a look at it.

Also, we already have a connector for Python, to make working with the REST API easier for data scientists and engineers. If Python is not your tool, let us know what is. We might be able to develop a new connector (might, not guaranteed!).


Hi @Szilard_Barany,
Thank you for the very detailed and informative answer. We mainly use Java in the application. Do we have a Java connector to TigerGraph? My purpose is to make a data loader (that is where my question about using the API to install the loading job comes from) and a remover (to delete the data loaded for testing purposes) for all the testing data we will put there.


No, TigerGraph does not currently have a Java connector, but it’s most likely the next one I will look at.

Why load data via the REST API? The loading job mechanism is very easy to use and has good performance. Where is your data coming from? Streaming?

Well, I want to automate the process so that people only need to prepare the testing data and specify the file path; the tests should be taken care of by the code seamlessly, from loading the data all the way to removing it after the tests finish, if that makes sense. Maybe I’m missing something here; please kindly point it out.