Saturday 18 May 2019

Copying file geodatabase on ftp site to local disk using Python?


There is a file geodatabase sitting on an FTP site that I would like to download with a Python script. At the moment my thinking is to copy the FTP geodatabase into a geodatabase on my computer. Below is the script I've started. Does anyone know how I can alter this script so that it retrieves the gdb from the FTP site? Thank you




Below is my final, working code, based on the answer @om_hennners provided.


import os
import posixpath
import arcpy
from arcpy import env
from ftplib import FTP

arcpy.env.overwriteOutput = True

directory = "/group/geodb" # location of gdb on ftp
folder = "D:\\temp"
out_gdb = "data.gdb"
out_path = os.path.join(folder, out_gdb)
copy_gdb = "hydro.gdb" # This is the gdb I would like to copy from the ftp site

ftp = FTP("10.4.2.22")
ftp.login("user", "pass")

# FTP paths use forward slashes on every platform, so join with posixpath
ftp.cwd(posixpath.join(directory, copy_gdb))
print "Changed to " + posixpath.join(directory, copy_gdb)

filenames = ftp.nlst()
print filenames

print "starting to write"
for f in filenames:
    with open(os.path.join(out_path, f), 'wb') as local_file:
        ftp.retrbinary('RETR ' + f, local_file.write)

ftp.close()
print "closed ftp connection"

Answer



In this case you don't need the arcpy libraries to copy the geodatabase. Instead you're copying files across an FTP connection, which you can do with ftplib's retrbinary command.
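To see why passing local_file.write works, note that retrbinary reads the remote file in blocks and hands each block to whatever callback you supply. A small stand-in (not a real FTP call, just a sketch of the callback contract):

```python
# retrbinary('RETR name', callback) reads the file off the data connection
# in blocks and calls callback(block) for each one. This fake stand-in
# mimics that contract so the pattern can be seen without a live server.
def fake_retrbinary(callback):
    for block in (b"GDB", b"DATA"):  # pretend these came off the socket
        callback(block)

received = bytearray()
fake_retrbinary(received.extend)  # like passing local_file.write
print(bytes(received))           # b'GDBDATA'
```

Any callable that accepts a bytes block will do, which is why a plain file object's write method slots straight in.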


Also note that the file system treats a file geodatabase as a folder object with a set of files inside it, i.e. it is not a single binary file that can be transferred in one hit using ftplib.
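A quick way to convince yourself of this: a file geodatabase on disk is just a directory full of component files. This sketch fakes one in a temp directory purely to illustrate the structure; the file names are made up, not a real Esri layout:

```python
import os
import tempfile

# Build a mock 'geodatabase' folder; the component file names below are
# illustrative only, not an actual Esri file layout.
root = tempfile.mkdtemp()
gdb = os.path.join(root, "hydro.gdb")
os.mkdir(gdb)
for part in ("a00000001.gdbtable", "a00000001.gdbindexes", "timestamps"):
    open(os.path.join(gdb, part), "wb").close()

# It is an ordinary directory, so a plain listing gives the set of files
# that an FTP transfer would have to copy one by one.
print(os.path.isdir(gdb))
print(sorted(os.listdir(gdb)))
```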


So really what you want to do is create a local folder called data.gdb, then on the FTP server loop through all the files in hydro.gdb and download them. Something like the following should work (with a bit of code borrowed from this Stack Overflow answer, as I don't know ftplib very well):



import os
import os.path
import posixpath
from ftplib import FTP

directory = "/group/geodb" # location of gdb on ftp
copy_gdb = "hydro.gdb" # This is the gdb I would like to copy from the ftp site

folder = "D:\\temp"
out_gdb = "data.gdb"
out_path = os.path.join(folder, out_gdb)


#First, create the out geodatabase as a folder
os.mkdir(out_path)

#FTP logon
ftp = FTP("10.4.2.22")
ftp.login("user", "pass")

#Again, treat the gdb as a folder and navigate there
#(FTP paths use forward slashes, hence posixpath rather than os.path)
ftp.cwd(posixpath.join(directory, copy_gdb))

print "Changed to " + posixpath.join(directory, copy_gdb)

#Now get a list of all files in the folder
filenames = ftp.nlst()
print filenames

#and loop through the filenames to download the files to your local 'gdb'
for f in filenames:
    with open(os.path.join(out_path, f), 'wb') as local_file:
        ftp.retrbinary('RETR ' + f, local_file.write)


ftp.close()
print "closed ftp connection"
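One portability note on building the remote path: os.path.join uses the local OS separator, so on Windows it splices a backslash into the path, which many FTP servers reject. posixpath.join always produces forward-slash paths, which is what FTP expects:

```python
import ntpath
import posixpath

directory = "/group/geodb"
copy_gdb = "hydro.gdb"

# ntpath.join is what os.path.join resolves to on Windows:
print(ntpath.join(directory, copy_gdb))    # /group/geodb\hydro.gdb
# posixpath.join matches the forward-slash form FTP servers expect:
print(posixpath.join(directory, copy_gdb)) # /group/geodb/hydro.gdb
```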
