I have a huge JSON file (10 million rows and still counting, as it hasn't finished opening yet), but 99.9% of it is useless to me. There are about 2,200 records in the following format.
The problem is that the geometry array (if that is the right term) is a list of coordinates running anywhere from 7,000 to 21,000 lines per record, all of which are useless to me. This is what pushes the overall file past 5 GB.
At the moment I am going through each record and manually deleting the array. Is there a way to automate it? I don't want to spend a day going through all 2,200 records only to be given a new version at some point, meaning I'd have to do it all again.
Code:
{
  "type" : "FeatureCollection",
  "name" : "%filename%",
  "features" : [
    { "type" : "Feature", "geometry" : {=}, "properties" : {=} },
    { "type" : "Feature", "geometry" : {=}, "properties" : {=} },
    . . .
  ]
}
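For a structure like the above, deleting every feature's geometry can be automated with a short script. Here is a minimal Python sketch: the file paths (`input.geojson`, `slim.geojson`) are placeholders for your actual filenames, and note that `json.load` reads the whole document into memory, so for a 5 GB file you may need a streaming parser (e.g. the third-party `ijson` package, or `jq` with `--stream`) rather than the standard-library approach shown here.

```python
import json

def strip_geometry(feature_collection):
    """Return a copy of a GeoJSON-style FeatureCollection with each
    feature's geometry replaced by None, keeping all properties intact."""
    slimmed = dict(feature_collection)
    slimmed["features"] = [
        # Copy each feature, overriding only its "geometry" key.
        {**feature, "geometry": None}
        for feature in feature_collection["features"]
    ]
    return slimmed

if __name__ == "__main__":
    # Placeholder paths -- substitute your real input and output files.
    with open("input.geojson", encoding="utf-8") as src:
        data = json.load(src)
    with open("slim.geojson", "w", encoding="utf-8") as dst:
        json.dump(strip_geometry(data), dst)
```

Rerunning the script on each new version of the file takes seconds, so a reissued dataset is no longer a problem.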