Stable Diffusion Technique: Achieving Consistent Clothing Changes with AI

2023-06-28 02:32:37
This post covers a topic similar to the earlier advanced Stable Diffusion dressing-and-changing technique, but with a more stable way to change a character's clothes into exactly what you want. When making consistent illustrations and comics there are two big problems: consistency of faces, which can be solved with a LoRA, and consistency of clothing, for which there is currently no general solution. In the past it could only be approximated with prompts, but what the prompts produced was not necessarily what we wanted. The method below lets us strongly constrain the AI to draw the clothes according to our instructions. It cannot reproduce every pattern on a garment 100%, but in many cases, such as comic strips or novel illustrations, it is good enough.

Problem

We have a picture in which the character wears a white pleated skirt:

(Image: Gorgeous Hana)

But I want her to change into a red pleated skirt.

Method

We start with a technique called photo bashing to replace the clothes. Open a drawing program such as Photoshop or GIMP, paste the red pleated skirt onto the character picture, and cut away the excess so it roughly follows the original character's body curve:

(Image: photo bashing — the red skirt pasted onto the original picture)

This composite image is then passed to img2img or Inpaint. In my example I pass the image to Inpaint and mask the skirt. As for the prompt: with img2img you need the complete prompt, while with Inpaint you only need the prompt describing the clothes being repainted.
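The photo-bashing step can be sketched in code. This is a toy illustration in plain Python, with nested lists of RGB tuples standing in for images; in practice you would do this interactively in Photoshop or GIMP, or with an image library such as Pillow. The function name and data layout here are made up for illustration:

```python
# A minimal sketch of photo bashing: overlay a trimmed garment cut-out
# onto a base image wherever the cut-out's mask is opaque.

def photo_bash(base, garment, mask, offset=(0, 0)):
    """Overlay `garment` pixels onto `base` where `mask` is truthy.

    base, garment: 2D lists of (r, g, b) tuples
    mask: 2D list of 0/1 values, same size as garment
    offset: (row, col) of the garment's top-left corner on the base
    """
    out = [row[:] for row in base]  # copy so the original stays intact
    dy, dx = offset
    for y, mrow in enumerate(mask):
        for x, m in enumerate(mrow):
            ty, tx = y + dy, x + dx
            # "cut off the excess part": skip masked-out pixels and
            # anything that falls outside the canvas
            if m and 0 <= ty < len(out) and 0 <= tx < len(out[0]):
                out[ty][tx] = garment[y][x]
    return out

WHITE, RED = (255, 255, 255), (200, 30, 40)
base = [[WHITE] * 4 for _ in range(4)]   # white pleated skirt
garment = [[RED] * 2 for _ in range(2)]  # red skirt cut-out
mask = [[1, 1], [1, 0]]                  # trimmed to the body curve
composite = photo_bash(base, garment, mask, offset=(1, 1))
```

The composite only needs to roughly fit the body; the later Inpaint passes will blend in the texture and shadows.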
In my example that is a red short pleated skirt:

(masterpiece, top quality, best quality, official art, beautiful and aesthetic:1.3), extreme detailed, Hana, red short pleated skirt, fantasy, <lora:Hana25:0.4>

together with a medium Denoising strength of 0.5.

Next, turn on ControlNet in normal (normal map) mode and feed it the unmodified original image, so that the redrawn clothes stay close to the character's original clothing shape:

(Image: the unmodified picture used as the normal map source)

Finally, use Loopback to feed the image back in repeatedly, letting the AI redraw the same region over several rounds. A high Final denoising strength of 0.75 gives the AI enough strength to repaint the image, while ControlNet's normal map constrains the shape of the garment.

Now you can start generating and pick whichever result you like. In the first two rounds the pasted clothes still lack natural fabric texture and shadows, but by the third and fourth images the results are already very stable!

This method works best when the pasted clothes are similar in shape to the original and only the color differs. If the clothes differ greatly from the original, for example changing a narrow skirt into a pleated skirt, try lowering the Control Weight of ControlNet's normal model and increasing the number of Loopback rounds, then choose a satisfactory result. Happy AI generating!
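The Loopback idea can be sketched as a short Python loop. Here `redraw` is a hypothetical stand-in for one img2img/Inpaint pass (the real work is done by Stable Diffusion), and the linear strength ramp toward the final denoising strength is one plausible schedule, not necessarily the exact curve the WebUI's Loopback script uses:

```python
# A minimal sketch of Loopback: feed each output back in as the next
# input so the pasted skirt is progressively blended into the scene.
# All intermediate results are kept so a satisfactory one can be chosen.

def loopback(image, redraw, rounds=4, final_denoise=0.75):
    """Run `redraw(image, strength)` repeatedly, ramping the denoising
    strength linearly up to `final_denoise`, and return every
    intermediate result."""
    results = []
    for i in range(1, rounds + 1):
        strength = final_denoise * i / rounds  # ramp toward 0.75
        image = redraw(image, strength)        # one img2img pass
        results.append(image)
    return results

# Stand-in "model": just records the strength used at each round.
history = loopback("composite.png",
                   lambda img, s: f"{img}@{s:.2f}",
                   rounds=4)
```

Early rounds make small changes, so texture and shadows only settle after a few iterations, matching the observation above that the third and fourth pictures are where results stabilize.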

