
Midjourney Omni-Reference: Guide to --cref & --sref

2025-07-28 · ImaginePro · 9 minute read


This guide breaks down Midjourney's powerful Omni-Reference system, empowering you to create consistent characters and styles with the --cref and --sref parameters.

For years, developers and designers using AI image generators have faced a persistent challenge: consistency. Creating a character in one image and then successfully recreating them in a different pose, setting, or style was a game of chance. You could get close, but minor details would always be off, making it difficult to build storyboards, game assets, or consistent brand visuals. The Midjourney Omni-Reference system is the definitive solution to this problem.

What is the Midjourney Omni-Reference System?

The Midjourney Omni-Reference system is an umbrella term for a suite of advanced features designed to give you unprecedented control over visual consistency in your generated images. Instead of relying solely on text prompts to describe a character or style, this system allows you to provide reference images to guide the AI.

The two cornerstone features of this system are:

  1. Character Reference (--cref): For maintaining character identity across multiple images.
  2. Style Reference (--sref): For matching a specific aesthetic or artistic style.

By mastering these parameters, you can move from generating random one-off images to building cohesive visual narratives, a critical leap for any professional creative project. This guide provides a Midjourney Omni-Reference tutorial for beginners and experienced users alike.

Creating Consistent Characters with Character Reference (--cref)

The --cref parameter is Midjourney's answer to the "consistent characters" problem. It analyzes the facial features, hair, and even clothing of a person in a reference image and intelligently applies that identity to new creations.

How to Use the --cref Parameter

Using Character Reference is straightforward. You append the --cref parameter to your prompt, followed by a direct URL to your reference image.

The basic structure is: <Your Prompt> --cref <URL to your character image>

For example, to take a character and place them on a futuristic street, your prompt would look like this:

A full-body shot of a character walking through a neon-lit cyberpunk street at night --cref https://example.com/path/to/my-character.jpg --ar 16:9 --v 6.0

Midjourney will generate an image that maintains the core identity of the character from the URL while placing them in the new scene described by your text prompt.

Best Practices for Your Character Reference Image

The quality of your output depends heavily on the quality of your input reference. To get the best results:

  • Use a Clear, Unobstructed Image: The character should be clearly visible. Avoid images with heavy shadows, creative lighting that obscures features, or other people in the frame.
  • Focus on the Subject: A simple headshot or a three-quarters body shot with a neutral background works best. The more Midjourney can focus on the character, the more accurately it can replicate them.
  • No Need for Perfection: The reference image does not need to be an AI generation. A clear photograph or a detailed illustration works perfectly.

Controlling Character Fidelity with --cw (Character Weight)

Sometimes, you may want to retain a character's face but change their outfit or other attributes. This is where the Character Weight parameter, --cw, comes in. It allows you to adjust how strongly Midjourney adheres to the reference image, with a range from 0 to 100.

  • --cw 100 (Default): Attempts to copy the face, hair, and clothing from the reference.
  • --cw 0: Focuses only on the face. This is ideal for putting a character in new outfits or historical costumes.

Here is how you would use it to change a character's clothing:

A portrait of a character as a medieval knight in steel armor --cref https://example.com/path/to/my-character.jpg --cw 0

This prompt will transfer the face from your reference image onto the body of a medieval knight, ignoring the clothing from the original image.

Matching Aesthetics with Style Reference (--sref)

While --cref handles who is in the image, --sref handles what the image feels like. The Style Reference parameter allows you to transfer the aesthetic, color palette, texture, and overall mood from a reference image to your new generation.

How to Use the --sref Parameter

Similar to Character Reference, you append --sref followed by a URL to the style you want to emulate.

A tranquil bamboo forest at dawn --sref https://example.com/path/to/vintage-watercolor-style.jpg --v 6.0

This prompt will generate a bamboo forest, but rendered in the specific vintage watercolor style of the reference image. You can even use multiple style references by providing multiple URLs separated by spaces.

Choosing Effective Style Reference Images

An effective style reference doesn't need to have a clear subject. In fact, abstract images often work best. Consider using:

  • Abstract paintings for color palettes and brush strokes.
  • Textile patterns for texture.
  • Gradient images for color schemes.
  • Images with a strong mood (e.g., grainy black-and-white film) to capture a feeling.

The key is to think about the aesthetic elements you want to borrow, not the content of the reference image itself.

What is the Difference Between --sref and Image Prompts?

This is a common point of confusion. Here’s a clear breakdown:

  • Image Prompts (using a URL at the start of your prompt): An image prompt tells Midjourney, "Make something like this." It tries to borrow heavily from the subject matter, composition, and colors of the source image.
  • Style Reference (--sref): A style reference tells Midjourney, "Make my prompt, but render it in the style of this." It isolates and transfers the abstract aesthetic DNA of the reference image without being bound to its original subject matter.

Essentially, an image prompt influences the what, while --sref influences the how.

Combining --cref and --sref: The Ultimate Workflow

The true power of the Midjourney Omni-Reference system is unlocked when you combine both parameters in a single prompt. This allows you to place a consistent character into a scene with a consistent, predetermined style. This is the ultimate workflow for creating professional-grade visual assets.

The prompt structure is simple: just include both parameters.

A portrait of a character looking out a rain-streaked window --cref https://example.com/path/to/my-character.jpg --sref https://example.com/path/to/dramatic-film-noir-style.jpg --cw 0 --ar 16:9 --v 6.0

Breakdown of the prompt:

  1. A portrait of a character...: The new scene and action.
  2. --cref ...: Defines who the character is.
  3. --sref ...: Defines the aesthetic (dramatic film noir).
  4. --cw 0: Ensures we only transfer the face, letting the film noir style dictate the clothing.

This level of control transforms Midjourney from a creative toy into a robust production tool.

Practical Use Cases for Designers & Developers

The Omni-Reference system opens up new, efficient workflows for technical and creative professionals.

For Designers: Creating Brand Mascots and Consistent Storyboards

Imagine you're designing a new mascot for a tech startup. You can generate the perfect initial concept image, then use --cref to place that exact mascot in dozens of different marketing scenarios: featured on a website, in a presentation, or in a social media post. By adding --sref with an image of your brand's color palette, you ensure every single generation is perfectly on-brand.

For Developers: Generating Character Sprites and Game Assets

For game developers, this system is a game-changer. You can design a main character and then use --cref to generate a complete sprite sheet.

For instance, you could run a series of prompts:

  • pixel art character, idle animation --cref <URL> --sref <URL to pixel art style>
  • pixel art character, walking animation --cref <URL> --sref <URL to pixel art style>
  • pixel art character, jumping animation --cref <URL> --sref <URL to pixel art style>
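A short loop can generate this series of prompts automatically. The sketch below assumes placeholder reference URLs and a fixed list of animation states; both are illustrative:

```python
# Placeholder reference URLs -- substitute your own character and style images.
CHARACTER_URL = "https://example.com/character.png"
STYLE_URL = "https://example.com/pixel-art-style.png"

animation_states = ["idle", "walking", "jumping"]

# Build one prompt per animation state, each sharing the same character
# and style references so every sprite stays visually consistent.
prompts = [
    f"pixel art character, {state} animation --cref {CHARACTER_URL} --sref {STYLE_URL}"
    for state in animation_states
]

for p in prompts:
    print(p)
```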

For developers looking to integrate this process into an automated content pipeline, using an API becomes essential. Services like the Midjourney API on imaginepro.ai allow you to programmatically submit these complex prompts, enabling the rapid generation of thousands of consistent game assets without manual intervention in Discord.
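As a rough sketch of what such a pipeline might look like, the snippet below assembles one API request. The endpoint URL and payload fields here are hypothetical; consult the imaginepro.ai API documentation for the real route, schema, and authentication before use:

```python
import json

# Hypothetical endpoint -- check the provider's API documentation
# for the actual route, payload schema, and authentication scheme.
API_URL = "https://api.imaginepro.ai/v1/imagine"

def build_request(prompt: str, token: str) -> dict:
    """Bundle the URL, auth header, and body for one generation request."""
    return {
        "url": API_URL,
        "headers": {"Authorization": f"Bearer {token}"},
        "body": {"prompt": prompt},
    }

request = build_request(
    "pixel art character, idle animation --cref <URL> --sref <URL>",
    token="YOUR_API_TOKEN",
)

# In a real pipeline this would be dispatched with an HTTP client, e.g.:
# requests.post(request["url"], json=request["body"], headers=request["headers"])
print(json.dumps(request, indent=2))
```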

Frequently Asked Questions (FAQ)

Here’s a quick reference and answers to common questions.

Quick Reference Cheat Sheet

| Parameter | Syntax | Values | Purpose |
| --- | --- | --- | --- |
| Character Reference | --cref <URL> | A valid image URL | Copies the identity of a character from a reference image. |
| Character Weight | --cw <number> | 0-100 | Controls the strength of the character reference. 100 = full copy, 0 = face only. |
| Style Reference | --sref <URL> | A valid image URL | Copies the aesthetic and style from a reference image. |

What is the difference between Character Reference (--cref) and Style Reference (--sref)?

To put it simply: --cref controls the subject (the "who"), while --sref controls the aesthetic (the "how"). Use --cref to ensure the same person appears in all your images. Use --sref to ensure all your images share the same artistic style, like "watercolor" or "vaporwave."

Can I use multiple reference images?

Yes. You can provide multiple URLs for both --cref and --sref. For --cref, this will "blend" the features of the different people. For --sref, this will blend the different aesthetics. Example: --sref <URL1> <URL2>

Does the Omni-Reference system work with older Midjourney versions?

No. The --cref and --sref parameters were introduced with the V6 and Niji 6 models and are a core part of the new Omni model. They are not backward-compatible with V5.2 or earlier versions. For full functionality, ensure you are using the latest models. For more details, you can always check the official Midjourney Model Versions documentation.

Conclusion & Key Takeaways

Midjourney's Omni-Reference system, powered by the --cref and --sref parameters, is a monumental step forward for generative AI. It directly addresses the critical need for consistency, transforming Midjourney into a viable tool for professional design and development workflows.

By understanding how to use Character Reference for subject identity, Style Reference for artistic consistency, and combining them for ultimate control, you can elevate your creative output from random generations to cohesive, professional-quality projects. The era of one-shot, disconnected AI images is over; the future is about building consistent visual worlds. Now it's your turn to experiment and see what you can build.
