How to delete/rename files/folders in Azure Data Lake and blob store using Spark Scala?
Category: azure data lake
Question
Pradeep Ravi on Tue, 10 Jul 2018 10:50:24
I am using a Databricks Scala notebook, processing files from the data lake and storing them again in the data lake and blob store. I see some unwanted log files stored along with the data files. I therefore need a Scala-based solution to rename/delete the files/folders in Azure Data Lake and blob store, one that can be executed from within the Scala notebook.
Replies
VairavanS (Azure) on Tue, 10 Jul 2018 20:49:59
For Azure Data Lake, you can try to rename or delete a file by calling its REST endpoints from Spark Scala.
Please let me know if that helps.
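The original links to the endpoints did not survive in this thread, but ADLS Gen1 exposes a WebHDFS-compatible REST API, so the calls can be sketched as below. The account name, paths, and OAuth token are placeholders, not values from the thread.

```scala
// Sketch: ADLS Gen1 exposes a WebHDFS-compatible REST API, so a file or folder
// can be deleted or renamed with plain HTTP calls. The account name, paths and
// the AAD bearer token are placeholders -- substitute your own values.
import java.net.{HttpURLConnection, URL}

object AdlsRest {
  // WebHDFS URL for a recursive DELETE of a file or folder.
  def deleteUrl(account: String, path: String): String =
    s"https://$account.azuredatalakestore.net/webhdfs/v1/$path?op=DELETE&recursive=true"

  // WebHDFS URL for a RENAME (move) of a file or folder.
  def renameUrl(account: String, src: String, dest: String): String =
    s"https://$account.azuredatalakestore.net/webhdfs/v1/$src?op=RENAME&destination=/$dest"

  // Fire the request with an AAD bearer token. In WebHDFS, DELETE uses the
  // HTTP DELETE verb and RENAME uses PUT.
  def call(url: String, verb: String, token: String): Int = {
    val conn = new URL(url).openConnection().asInstanceOf[HttpURLConnection]
    conn.setRequestMethod(verb)
    conn.setRequestProperty("Authorization", s"Bearer $token")
    conn.getResponseCode // 200 on success
  }
}
```

For example, `AdlsRest.call(AdlsRest.deleteUrl("myacct", "tmp/logs"), "DELETE", token)` would recursively remove a `tmp/logs` folder.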
Simon_Peacock on Fri, 23 Nov 2018 10:34:59
I have a process using Azure Databricks that writes out to the Data Lake in Parquet, and I use the following to drop the top-level folder that gets created by the Parquet write:
dbutils.fs.rm("adl://XXXXXXXX.azuredatalakestore.net/<FOLDER_PATH>", true)
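For renaming (which `dbutils.fs.rm` does not cover), one option is the Hadoop FileSystem API that is already on the cluster; it resolves whatever scheme the path uses, so the same code works against Data Lake and blob store. A minimal sketch, with placeholder paths:

```scala
// Sketch: the Hadoop FileSystem API resolves the filesystem from the path's
// scheme (adl://, wasbs://, file:// ...), so one helper covers Data Lake and
// blob store alike. All paths here are placeholders.
import java.net.URI
import org.apache.hadoop.conf.Configuration
import org.apache.hadoop.fs.{FileSystem, Path}

object FsOps {
  // Recursively delete a file or folder; returns true if something was removed.
  def remove(conf: Configuration, dir: String): Boolean = {
    val fs = FileSystem.get(new URI(dir), conf)
    fs.delete(new Path(dir), true) // true = recursive
  }

  // Rename (move) a file or folder within the same store.
  def rename(conf: Configuration, src: String, dest: String): Boolean = {
    val fs = FileSystem.get(new URI(src), conf)
    fs.rename(new Path(src), new Path(dest))
  }
}

// In a Databricks notebook the cluster configuration is already wired up, e.g.:
//   FsOps.rename(spark.sparkContext.hadoopConfiguration,
//                "adl://XXXXXXXX.azuredatalakestore.net/out/part-0000.parquet",
//                "adl://XXXXXXXX.azuredatalakestore.net/out/data.parquet")
```

For simple moves inside a notebook, `dbutils.fs.mv(src, dest)` is an equivalent shortcut.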
reb2121 on Thu, 23 Jan 2020 13:37:05
Can this be achieved in Data Lake Gen2?