Yahoo Canada Web Search


  1. ai-comic-factory · 7.17k likes · Running on CPU Upgrade. Create your own AI comic with a single prompt.


  2. huggingface.co › spaces › jbilcke-hf — Hugging Face

    The Space's README front matter:

    emoji: 👩‍🎨
    colorFrom: red
    colorTo: yellow
    sdk: docker
    pinned: true
    app_port: 3000
    disable_embedding: true
    short_description: Create your own AI comic with a single prompt
    hf_oauth: true
    hf_oauth_expiration_minutes: 43200
    hf_oauth_scopes: [inference-api]
    ---
    # AI Comic Factory

    Last release: AI Comic Factory 1.2

  3. The AI Comic Factory is an online AI Comic Book Generator platform that allows you to generate your own comic book with the help of Hugging Face Space.

    • AI Comic Factory
    • Running the project at home
    • The LLM API (Large Language Model)
    • The Rendering API

    (note: the website "aicomicfactory.com" is not affiliated with the AI Comic Factory project, nor is it created or maintained by the AI Comic Factory team. If their website has an issue, please contact them directly.)

    First, I would like to highlight that everything is open-source (see here, here, here, here).

    However, the project isn't a monolithic Space that can be duplicated and run immediately: it requires various components to run for the frontend, backend, LLM, SDXL, etc.

    If you duplicate the project and open the .env file, you will see that it requires several variables.

    Provider config:

    • LLM_ENGINE: can be one of "INFERENCE_API", "INFERENCE_ENDPOINT", "OPENAI"

    • RENDERING_ENGINE: can be one of "INFERENCE_API", "INFERENCE_ENDPOINT", "REPLICATE", "VIDEOCHAIN", "OPENAI" for now, unless you code your own custom solution
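    Taken together, a minimal configuration starts by selecting a provider for each engine. A sketch (the values shown are just one valid combination from the lists above):

    ```env
    # Select which backends power the LLM and the image rendering.
    # Each variable accepts only the values listed above.
    LLM_ENGINE="INFERENCE_API"
    RENDERING_ENGINE="INFERENCE_API"
    ```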

    Option 1: Use an Inference API model

    This is a new option, added recently, where you can use one of the models from the Hugging Face Hub. By default, we suggest using CodeLlama 34b, as it provides better results than the 7b model. To activate it, create a .env.local configuration file:
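    A sketch of such a .env.local. Only LLM_ENGINE is confirmed above; the token and model variable names here are illustrative placeholders, so check the project's .env for the exact keys:

    ```env
    LLM_ENGINE="INFERENCE_API"
    # Hypothetical variable names — verify against the project's .env:
    HF_API_TOKEN="hf_..."                                             # your Hugging Face access token
    LLM_HF_INFERENCE_API_MODEL="codellama/CodeLlama-34b-Instruct-hf"  # or a 7b model
    ```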

    Option 2: Use an Inference Endpoint URL

    If you would like to run the AI Comic Factory against a private LLM running on the Hugging Face Inference Endpoints service, create a .env.local configuration file. To run this kind of LLM locally, you can use TGI (please read this post for more information about licensing).
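    A sketch of the corresponding .env.local. Only LLM_ENGINE is confirmed above; the endpoint URL and token variable names are assumptions, so check the project's .env for the exact keys:

    ```env
    LLM_ENGINE="INFERENCE_ENDPOINT"
    # Hypothetical variable names — verify against the project's .env:
    HF_API_TOKEN="hf_..."                       # token with access to the endpoint
    LLM_HF_INFERENCE_ENDPOINT_URL="https://your-endpoint.endpoints.huggingface.cloud"
    ```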

    Option 3: Use an OpenAI API Key

    This is a new option, added recently, where you can use the OpenAI API with an OpenAI API key. To activate it, create a .env.local configuration file:
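    A sketch of the corresponding .env.local. Only LLM_ENGINE is confirmed above; the key and model variable names are assumptions, so check the project's .env for the exact keys:

    ```env
    LLM_ENGINE="OPENAI"
    # Hypothetical variable names — verify against the project's .env:
    LLM_OPENAI_API_KEY="sk-..."      # your OpenAI API key
    LLM_OPENAI_API_MODEL="gpt-4"     # any chat-capable model your key can access
    ```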

    The Rendering API

    This API is used to generate the panel images. It is an API I created for my various projects at Hugging Face.

    I haven't written documentation for it yet, but basically it is "just a wrapper ™" around other existing APIs:

    • The hysts/SD-XL Space by @hysts

    • Other APIs for making videos, adding audio, etc., but you won't need them for the AI Comic Factory
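    On the rendering side, the same pattern applies: pick a RENDERING_ENGINE and supply the matching credentials. A hedged sketch for the Replicate option (only RENDERING_ENGINE and its allowed values are confirmed above; the other variable names and the model id are placeholders):

    ```env
    RENDERING_ENGINE="REPLICATE"
    # Hypothetical variable names — verify against the project's .env:
    REPLICATE_API_TOKEN="r8_..."                       # your Replicate API token
    RENDERING_REPLICATE_API_MODEL="stabilityai/sdxl"   # an SDXL model hosted on Replicate
    ```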

  4. Oct 2, 2023 · Thousands of users have tried it to create their own AI comic panels, fostering a community of regular users. They share their creations, and some even open pull requests.

  5. In this tutorial, we'll show you how to fork and configure the AI Comic Factory to avoid long wait times and deploy it to your own private Space using the Inference API. It does not require strong technical skills, but some knowledge of APIs and environment variables, and a general understanding of LLMs & Stable Diffusion, are recommended.

  6. Generate comic panels using an LLM + SDXL. Powered by Hugging Face 🤗.
