Member since: 02-27-2020
Posts: 173
Kudos Received: 42
Solutions: 48

My Accepted Solutions
| Title | Views | Posted |
| --- | --- | --- |
|  | 1032 | 11-29-2023 01:16 PM |
|  | 1139 | 10-27-2023 04:29 PM |
|  | 1127 | 07-07-2023 10:20 AM |
|  | 2456 | 03-21-2023 08:35 AM |
|  | 877 | 01-25-2023 08:50 PM |
07-09-2020
11:30 AM
Starting from Ambari 2.7.5, the repositories require a username and password, which you get from Cloudera if you have the required support contract. Existing users can file a non-technical case in the support portal (https://my.cloudera.com) to obtain credentials. You can find more information on the Accessing Ambari Repositories page.
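For illustration only, a paywalled repo entry embeds the credentials in the baseurl along the lines of the sketch below; the URL path, <username>, and <password> are placeholders, so copy the exact values from the documentation and your Cloudera credentials:

[ambari-2.7.5.0]
name=Ambari 2.7.5.0
# placeholder baseurl; use the exact paywall URL from the Accessing Ambari Repositories page
baseurl=https://<username>:<password>@archive.cloudera.com/p/ambari/<os>/2.x/updates/2.7.5.0
enabled=1
gpgcheck=1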
07-08-2020
10:05 AM
Glad you are making progress. The command you are looking for is actually LOAD DATA LOCAL INPATH ... Note that you missed the LOCAL keyword. Without it, Hive looks for the file in HDFS rather than on your local machine, which is why you see the error "No files matching path hdfs://quickstart.cloudera:8020/users/melissava...".
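For illustration, a minimal sketch of the full statement; the local path and table name below are placeholders, not the ones from your environment:

LOAD DATA LOCAL INPATH '/path/on/your/local/machine/file.csv'
INTO TABLE your_table;

With LOCAL, Hive reads the file from the machine where the client runs; without it, the INPATH is resolved against HDFS.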
07-06-2020
10:41 PM
Look at the quote characters at the start and end of this string: '/users/melissavallejos/desktop/proyectobigdata/AccidentesBicicletas_2018.csv’ They are not the same: the opening quote is a straight single quote (') but the closing one is a curly quote (’). Replace the last character with a straight single quote (') and the command should parse just fine.
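For reference, with a matching straight quote at the end the path literal would read:

'/users/melissavallejos/desktop/proyectobigdata/AccidentesBicicletas_2018.csv'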
06-30-2020
08:01 AM
You can do this: !pip3 install sklearn This will install the needed package. Note that the leading ! is a shell escape: it runs the command not in your Python session but in the underlying OS shell.
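A quick way to confirm the install from the same session (a minimal sketch, assuming an IPython/Jupyter-style workbench where the ! shell escape is available):

# Install into the underlying environment via the shell escape
!pip3 install sklearn

# Then verify from the Python session
import sklearn
print(sklearn.__version__)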
06-29-2020
09:29 AM
1 Kudo
This problem is typically solved by (a) clearing cookies and restarting your browser, and/or (b) logging out of CDSW and back in. Let me know if that works for you.
06-27-2020
10:48 PM
1 Kudo
Hi Guy, please try adjusting your command to the following: ozone sh volume create --quota=1TB --user=hdfs o3://ozone1/tests Note that the documentation states the last parameter is a URI in the format <prefix>://<Service ID>/<path>. The Service ID is the value you found in ozone-site.xml.
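If the create succeeds, something like the following should display the new volume (a sketch, assuming the same o3://ozone1/tests URI and an Ozone release that provides the volume info subcommand):

ozone sh volume info o3://ozone1/tests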
06-12-2020
09:48 AM
Hi @Maria_pl , generally speaking the approach is as follows:
1. Generate a dummy flow file that will act as the trigger (GenerateFlowFile processor).
2. Next is an UpdateAttribute processor that sets the start date and end date as attributes on the flow file.
3. ExecuteScript comes next. This can be a Python script, or whichever language you prefer, that uses the start and end attributes to list out all the dates in between (a sketch of that logic is shown below).
4. If your script produces a single output listing the dates, you can then use a SplitText processor to cut each row into its own flow file, and from there each flow file will carry its own unique date in your range.
Hope that makes sense.
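For illustration, a minimal Python sketch of the date-listing logic only. The attribute names (start_date, end_date) and the date format are assumptions, and the ExecuteScript plumbing around it (reading the flow file, writing the result back through the session) is left out:

from datetime import datetime, timedelta

def dates_between(start_str, end_str, fmt="%Y-%m-%d"):
    """Return every date from start to end (inclusive), one per line."""
    start = datetime.strptime(start_str, fmt).date()
    end = datetime.strptime(end_str, fmt).date()
    days = (end - start).days
    return "\n".join(str(start + timedelta(days=i)) for i in range(days + 1))

# Example with values UpdateAttribute might have set as start_date / end_date:
print(dates_between("2020-06-01", "2020-06-05"))

SplitText on line boundaries then turns each printed date into its own flow file.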
05-26-2020
07:48 PM
OK, so regarding single quotes vs. double quotes: in the HBase shell you have to use double quotes here every time, because text in single quotes is treated as a literal (see p. 271 of the HBase Definitive Guide). After some more research I came across this post, which seems to describe your problem exactly, along with two solutions for modifying your Java code. To summarize, the Java client for HBase expects row keys in their human-readable form, not their hexadecimal representation. The solution is to read your args as a Double type, not a String. Hope that finally resolves it.
05-26-2020
02:52 PM
Perhaps it's something about how Java interprets the args you pass to it when you run your code? It may be different from how the shell client interprets them (relevant discussion here). Can you show the command that executes your Java code, complete with the arguments passed to it? Also, print the arguments in your code (e.g. System.out.println(rowId)). Execute the code for the same key as you did in the shell (i.e. \x00\x0A@E\xFFn[\x18\x9F\xD4-1447846881#5241968320).