08-04-2016
11:39 AM
Hi @vshukla, thanks for the answer. I don't intend to create a Linux or other machine to test, as I prefer to stick with Windows. Here is the stack trace:

16/08/04 13:08:58 INFO SharedState: Warehouse path is '\\SERVER\Users\USER\STORAGE\Programming\IntelliJ\PropertyInvestmentCalcs\spark-warehouse'.
Exception in thread "main" java.io.IOException: No FileSystem for scheme: null
at org.apache.hadoop.fs.FileSystem.getFileSystemClass(FileSystem.java:2421)
at org.apache.hadoop.fs.FileSystem.createFileSystem(FileSystem.java:2428)
at org.apache.hadoop.fs.FileSystem.access$200(FileSystem.java:88)
at org.apache.hadoop.fs.FileSystem$Cache.getInternal(FileSystem.java:2467)
at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2449)
at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:367)
at org.apache.hadoop.fs.Path.getFileSystem(Path.java:287)
at org.apache.spark.sql.catalyst.catalog.SessionCatalog.makeQualifiedPath(SessionCatalog.scala:115)
at org.apache.spark.sql.catalyst.catalog.SessionCatalog.createDatabase(SessionCatalog.scala:145)
at org.apache.spark.sql.catalyst.catalog.SessionCatalog.<init>(SessionCatalog.scala:89)
at org.apache.spark.sql.internal.SessionState.catalog$lzycompute(SessionState.scala:95)
at org.apache.spark.sql.internal.SessionState.catalog(SessionState.scala:95)
at org.apache.spark.sql.internal.SessionState$anon$1.<init>(SessionState.scala:112)
at org.apache.spark.sql.internal.SessionState.analyzer$lzycompute(SessionState.scala:112)
at org.apache.spark.sql.internal.SessionState.analyzer(SessionState.scala:111)
at org.apache.spark.sql.execution.QueryExecution.assertAnalyzed(QueryExecution.scala:49)
at org.apache.spark.sql.Dataset$.ofRows(Dataset.scala:64)
at org.apache.spark.sql.SparkSession.baseRelationToDataFrame(SparkSession.scala:382)
at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:143)
at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:122)
at PropertyInvestmentCalcs$.main(PropertyInvestmentCalcs.scala:27)
at PropertyInvestmentCalcs.main(PropertyInvestmentCalcs.scala)
16/08/04 13:09:00 INFO SparkContext: Invoking stop() from shutdown hook
16/08/04 13:09:00 INFO SparkUI: Stopped Spark web UI at http://192.168.1.100:4040
16/08/04 13:09:00 INFO MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
16/08/04 13:09:00 INFO MemoryStore: MemoryStore cleared
16/08/04 13:09:00 INFO BlockManager: BlockManager stopped
16/08/04 13:09:00 INFO BlockManagerMaster: BlockManagerMaster stopped
16/08/04 13:09:00 INFO OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
16/08/04 13:09:00 INFO SparkContext: Successfully stopped SparkContext
16/08/04 13:09:00 INFO ShutdownHookManager: Shutdown hook called
16/08/04 13:09:00 INFO ShutdownHookManager: Deleting directory C:\Users\USER\AppData\Local\Temp\spark-523c95ef-a46c-4b16-88c6-3da8f6f2a801

Does that give sufficient additional information? Thanks in advance for any help!
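For what it's worth, the trace suggests the warehouse path (a UNC path here) is resolving to a URI with no recognizable scheme, which is what `FileSystem.getFileSystemClass` rejects as "No FileSystem for scheme: null". A commonly suggested workaround on Windows is to set `spark.sql.warehouse.dir` explicitly to a `file:` URI when building the SparkSession. The sketch below is an assumption about the poster's setup, not their actual code; the warehouse path, app name, and master setting are placeholders (only the object name comes from the stack trace):

```scala
import org.apache.spark.sql.SparkSession

object PropertyInvestmentCalcs {
  def main(args: Array[String]): Unit = {
    // Point the warehouse at an explicit file: URI so Hadoop's FileSystem
    // can resolve the scheme instead of failing with "scheme: null".
    // The path below is a placeholder, not the poster's actual directory.
    val spark = SparkSession.builder()
      .appName("PropertyInvestmentCalcs")
      .master("local[*]")
      .config("spark.sql.warehouse.dir", "file:///C:/tmp/spark-warehouse")
      .getOrCreate()

    // ... then load data as before, e.g. spark.read.format(...).load(...)

    spark.stop()
  }
}
```

If the explicit warehouse URI works, that would confirm the problem is the default path resolution rather than the data-loading code itself.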