8 Fixes for a Second Monitor Not Detected on Windows

Key Takeaways

If your secondary monitor isn't being detected, you can force Windows to detect it, check all of your connections, reboot the PC, try an alternative connection, and update your graphics drivers to fix it.

A second monitor is one of the most potent productivity boosters for Windows users, but sometimes your computer fails to see the extra display. This is a common issue most people will run into eventually, but fixing it is easy!

Force Detect Displays

While automatic display detection usually works without a hitch, you can explicitly tell Windows to look for newly attached displays. This is especially useful with older connection standards such as VGA or DVI. To detect displays in Windows 11, follow these steps.

Right-click on the desktop and select "Display Settings" from the context menu.

In the "Display Settings" window, you will see a diagram of displays currently connected to your computer. If your second monitor is not detected, it won't appear.

Now click "Detect" and any available monitor will pop up.

If you're using Windows 10, the process is exactly the same. You'll find this window under Start > Settings > System > Display.

If no monitor appeared for you, let's move on to further troubleshooting steps.
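If you'd rather do the same thing from a script, here's a minimal Python sketch that shells out to two tools that ship with Windows: pnputil /scan-devices, which asks Windows to rescan for newly attached hardware, and DisplaySwitch.exe /extend, which forces the extended-desktop layout (the same as pressing Win+P and choosing Extend). Treat it as an optional illustration; pnputil may need an elevated prompt, and the DisplaySwitch switches have varied slightly between Windows builds.

    import subprocess

    # Ask Windows to rescan for newly attached hardware; roughly the
    # command-line equivalent of "Scan for hardware changes" in Device
    # Manager. Available on Windows 10 1903 and later; may require an
    # elevated (administrator) prompt.
    subprocess.run(["pnputil", "/scan-devices"], check=False)

    # Force the extended-desktop layout, the same as Win+P > Extend.
    subprocess.run(["DisplaySwitch.exe", "/extend"], check=False)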

Check That Everything Has Power

You should ensure that everything is connected to a power source and turned on. If the monitor or any in-between device, such as an active HDMI repeater, isn't getting power, then Windows won't detect the display.

Speaking of which, if you have a complicated setup with in-between devices such as a USB dock or HDMI repeater, try connecting the display without any of those devices in the chain. This will help you narrow down whether the problem lies with one of those gadgets rather than with Windows or the monitor itself.

Look for Misplaced Connections

Double-check that everything is plugged in where it's supposed to be. A common reason for this problem is plugging the monitor into the wrong display output. Most computer motherboards have an HDMI or other display output, but if you're using a discrete graphics card, you need to plug the monitor into the graphics card's output, not the motherboard's! If you're not sure whether you have a discrete graphics card, try every display output you can find.
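Not sure what graphics hardware you actually have? This minimal Python sketch (an optional illustration, not a required step) shells out to PowerShell and lists every graphics adapter Windows knows about. If a discrete NVIDIA or AMD card shows up alongside integrated graphics, the monitor belongs on the card's outputs.

    import subprocess

    # List the names of all graphics adapters via CIM. A discrete card
    # (e.g. "NVIDIA GeForce ..." or "AMD Radeon ...") listed next to
    # integrated graphics means the monitor should use the card's ports.
    cmd = "Get-CimInstance Win32_VideoController | Select-Object -ExpandProperty Name"
    subprocess.run(["powershell", "-NoProfile", "-Command", cmd], check=False)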

Related: 4 Common DisplayPort Problems, and How to Fix Them

Confirm the Display Is Set to the Right Input

Most modern computer monitors and virtually all TVs have multiple inputs. If you've got your PC plugged into HDMI 1, but the monitor is set to HDMI 2, for example, then you won't see anything on-screen.

Refer to your display's manual for instructions on changing sources, then switch to the input your computer is actually connected to. Alternatively, you can move the cable to the input the monitor is currently set to, or disconnect all other sources except the one you want to use.

Reboot the Computer

If everything is plugged in and powered on, and you've double-checked that it all makes sense, reboot the computer with everything in place. For help, check out our guides to rebooting a Windows 10 PC or a Windows 11 computer.

Some computers won't detect a monitor that's plugged in after booting up, so a simple reboot can get things working as they should.
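If you prefer the command line, the one-liner below (wrapped in Python to match the other sketches in this article) restarts Windows immediately; save your work first.

    import subprocess

    # Restart Windows right away: /r = reboot, /t 0 = zero-second delay.
    # This closes all open programs, so save your work before running it.
    subprocess.run(["shutdown", "/r", "/t", "0"], check=False)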

Use a Different Port, Cable, or Connection Type

If your graphics card and display both support an alternative connection type (like HDMI, DVI, or DisplayPort), try switching to it, or stick with the same connection type and try a different port or cable.

Sometimes shuffling things around seems to resolve the problem without any clear explanation. Unplugging and re-plugging a cable into the same port can even trigger detection, so don't be shy about randomly trying things.

Use a USB Adapter

You can use a USB to HDMI or USB to DisplayPort adapter to bypass your GPU outputs altogether. This isn't always the best solution for demanding uses such as high-resolution or high-refresh-rate gaming, since these adapters can add latency. But if you just need a second display output for PowerPoint or desktop multitasking, it works nearly as well as plugging directly into your GPU.

Update Your Display Drivers

Buggy display drivers can cause problems with basic functions such as multi-display support. So pay your graphics card manufacturer's website a visit and ensure you've got the latest version of your GPU driver installed.

While you're at it, check whether any Windows updates are pending. Getting all the relevant software up to date might fix a bug that no amount of fiddling with cables ever would, so don't skip this if nothing else listed above works. The process looks a little different on Windows 11 and Windows 10, so check out our guides to make sure you get the updates you need.
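Before you download anything, it helps to know which driver you're running now. This minimal Python sketch shells out to PowerShell and prints each graphics adapter's current driver version and date, which you can compare against the latest release on NVIDIA's, AMD's, or Intel's site.

    import subprocess

    # Show each graphics adapter's installed driver version and date so
    # you can compare it against the newest driver from the manufacturer.
    cmd = (
        "Get-CimInstance Win32_VideoController | "
        "Select-Object Name, DriverVersion, DriverDate | Format-Table -AutoSize"
    )
    subprocess.run(["powershell", "-NoProfile", "-Command", cmd], check=False)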
