Friday 15 June 2012

Google App Engine - Import Data Efficiently from Datastore to BigQuery Every Hour - Python



Currently, I'm using Google's two-step method: back up the Datastore and then import the backup into BigQuery. I also reviewed the code that uses a pipeline. Both methods are not efficient and have a high cost, since all the data is imported every time. I only need to add the records that were added since the last import.

What is the right way of doing this? Is there a working example of how to do it in Python?

You can look at streaming inserts. I'm looking at doing the same thing in Java at the moment.
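A minimal sketch of streaming rows into BigQuery from App Engine via tabledata().insertAll(), assuming the google-api-python-client library and placeholder project, dataset and table names (my-project, datastore_export and my_table are illustrative, not from the question):

```python
import httplib2
from apiclient.discovery import build
from oauth2client.appengine import AppAssertionCredentials

# Authorise as the App Engine service account for the BigQuery scope.
credentials = AppAssertionCredentials(
    scope='https://www.googleapis.com/auth/bigquery')
bigquery = build('bigquery', 'v2',
                 http=credentials.authorize(httplib2.Http()))


def stream_rows(rows):
    """Stream a list of dicts into BigQuery with tabledata().insertAll()."""
    body = {
        'rows': [
            # insertId lets BigQuery de-duplicate retried inserts.
            {'insertId': str(row['id']), 'json': row}
            for row in rows
        ]
    }
    return bigquery.tabledata().insertAll(
        projectId='my-project',        # placeholder project id
        datasetId='datastore_export',  # placeholder dataset
        tableId='my_table',            # placeholder table
        body=body).execute()
```

Note that insertAll only appends rows; the destination table with a matching schema has to exist already.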

If you want to do it every hour, you could add the inserts to a pull queue (either the serialised entities or their keys/IDs) each time you put a new entity into the Datastore, and then process the queue with an hourly cron job; see the sketch after this paragraph.
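A sketch of that pull-queue approach, assuming a pull queue named bigquery-import in queue.yaml, an hourly cron entry that hits a handler calling process_export_queue(), and the hypothetical stream_rows() helper from the sketch above; each task payload holds the key of a newly written entity:

```python
from google.appengine.api import taskqueue
from google.appengine.ext import ndb

# queue.yaml is assumed to contain:
#   queue:
#   - name: bigquery-import
#     mode: pull


def enqueue_for_export(entity_key):
    """Call this right after put(): remember the key of the new entity."""
    taskqueue.Queue('bigquery-import').add(
        taskqueue.Task(payload=entity_key.urlsafe(), method='PULL'))


def process_export_queue():
    """Hourly cron handler: lease keys, load entities, stream to BigQuery."""
    queue = taskqueue.Queue('bigquery-import')
    while True:
        tasks = queue.lease_tasks(lease_seconds=300, max_tasks=100)
        if not tasks:
            break
        keys = [ndb.Key(urlsafe=task.payload) for task in tasks]
        entities = ndb.get_multi(keys)
        rows = [dict(entity.to_dict(), id=entity.key.id())
                for entity in entities if entity is not None]
        stream_rows(rows)  # hypothetical helper from the sketch above
        # Only delete the leased tasks once the insert has succeeded,
        # so failed batches are retried on the next run.
        queue.delete_tasks(tasks)
```

Storing only keys keeps the queue payloads small; storing serialised entities instead would avoid the get_multi() round trip at processing time.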

python google-app-engine gae-datastore google-bigquery google-datastore
