I have a consumer version of the Rift, and I’ve got it working with MonoGame by PInvoking into a C++ dll I wrote using the Oculus SDK.
gitviper is right that earlier versions of the Rift acted like a second monitor attached to your PC, so you didn’t need to do anything special to render to its display.
In newer versions you use an SDK function to create swapchain rendertargets for each eye (or a single one shared by both eyes).
You render your scene into these rendertargets using the proper view and projection matrices. When you’re done rendering, you submit the swapchains to the Rift, and the Oculus runtime then applies lens distortion, chromatic aberration correction, timewarp, and so on.
I found that the simplest solution is to just render to MonoGame rendertargets normally and then copy those into the swapchain rendertargets created by the SDK.
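To make that concrete, here is a rough sketch of what the per-frame flow looks like on the C# side. Note that the DLL name and the two exported functions (`CopyEyeTexture`, `SubmitFrame`) are hypothetical stand-ins — the real names and signatures depend entirely on how you write your own C++ dll around the Oculus SDK:

```csharp
using System;
using System.Runtime.InteropServices;

static class RiftNative
{
    // Hypothetical exports of a C++ dll wrapping the Oculus SDK.
    // Your actual dll will define its own interface.
    [DllImport("RiftBridge.dll")]
    public static extern void CopyEyeTexture(int eye, IntPtr d3dContext, IntPtr sourceTexture);

    [DllImport("RiftBridge.dll")]
    public static extern void SubmitFrame();
}

// Per frame, roughly:
//   1. Render each eye's view into a regular MonoGame RenderTarget2D.
//   2. Hand the native context pointer and the rendertarget's native
//      texture pointer to the dll, which copies it into the current
//      swapchain texture (ovr_GetTextureSwapChainCurrentIndex /
//      ovr_CommitTextureSwapChain on the C++ side).
//   3. Call SubmitFrame(), which calls ovr_SubmitFrame in the dll.
```

The copy on the C++ side can be a plain `CopyResource` on the D3D11 context, as long as the MonoGame rendertarget and the swapchain texture have matching size and format.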
I needed to access the following things from MonoGame/SharpDX to make this work:
GraphicsDevice._d3dDevice.NativePointer
GraphicsDevice._d3dContext.NativePointer
Texture._texture.NativePointer
The device is needed so the SDK can create the swapchains.
The context and the texture are needed to perform the copy operation inside the C++ dll.
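Those fields are internal to MonoGame, so unless you build MonoGame from source with them exposed, one way to get at them is reflection. A minimal sketch, assuming the field names above (they are MonoGame internals and may change between releases):

```csharp
using System;
using System.Reflection;
using Microsoft.Xna.Framework.Graphics;

static class NativeInterop
{
    // Reads a private/internal MonoGame field via reflection and returns
    // the NativePointer of the SharpDX object it holds.
    static IntPtr GetNativePointer(object owner, string fieldName)
    {
        FieldInfo field = owner.GetType().GetField(
            fieldName, BindingFlags.NonPublic | BindingFlags.Instance);
        object sharpDxObject = field.GetValue(owner);
        PropertyInfo nativePointer = sharpDxObject.GetType().GetProperty("NativePointer");
        return (IntPtr)nativePointer.GetValue(sharpDxObject);
    }

    public static IntPtr D3DDevice(GraphicsDevice device)
        => GetNativePointer(device, "_d3dDevice");

    public static IntPtr D3DContext(GraphicsDevice device)
        => GetNativePointer(device, "_d3dContext");

    public static IntPtr TexturePointer(Texture texture)
        => GetNativePointer(texture, "_texture");
}
```

Reflection is slow, so cache the resulting pointers (or the FieldInfo objects) rather than looking them up every frame. The alternative is to keep a private MonoGame fork with these members made public.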