Support Questions


Are LLMs built into CML, or do we only access them through Hugging Face?

New Contributor

We're doing a bit of research here and trying to understand whether LLMs are only available by accessing them through Git and Hugging Face, or whether there are some built-in LLMs accessible within CML.

Thank you

1 ACCEPTED SOLUTION

Expert Contributor

Hi, you can build your own LLMs in CML, or you can access existing ones on Hugging Face. The Hugging Face models can be installed as an AMP; see here: https://docs.cloudera.com/machine-learning/cloud/applied-ml-prototypes/topics/ml-huggingface.html#co...

This is the quickest way to get started. The actual model is still stored on Hugging Face, so if your cluster is air-gapped you will have some trouble using the AMP feature - it's possible, but it's kind of a pain.

Cloudera does not SHIP any LLMs as far as I know.
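For a concrete sense of what "the model is still stored on HF" means, here is a minimal sketch of pulling a Hugging Face model into a CML session. This is only an illustration, not part of the AMP itself: it assumes the standard transformers library (plus a backend such as PyTorch) is installed in the project, and the model name is just a placeholder.

# Minimal sketch: loading a Hugging Face model inside a CML session.
# Assumes the `transformers` package (and a backend such as PyTorch) is
# installed in the project; "gpt2" is only an example model name.
from transformers import pipeline

# The weights are downloaded from the Hugging Face Hub on first use, which is
# why an air-gapped cluster is awkward: you would have to mirror or
# pre-download the files and point the library at a local path instead.
generator = pipeline("text-generation", model="gpt2")

print(generator("CML lets you", max_new_tokens=20)[0]["generated_text"])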


4 REPLIES

Community Manager

@paulfg Welcome to the Cloudera Community!

To help you get the best possible solution, I have tagged our CML experts @bbreak and @cravani, who may be able to assist you further.

Please keep us updated on your post, and we hope you find a satisfactory solution to your query.


Regards,

Diana Torres,
Community Moderator



Community Manager

@Gopinath @Mike Do you have any insights here? Thanks!


Regards,

Diana Torres,
Community Moderator



New Contributor

Thanks, Mike - I thought this was the case.