The UserInterface framework is looking very promising. I'm trying to use it on WP7. I read that you are still working on supporting touch input. How much is supported at the moment? I can get controls to render, but they don't seem to respond to touches yet.
This is what my game's Initialize method looks like:
_guiManager = new GuiManager(this.Services);
_inputManager = new InputManager(Services);
var viewPort = GraphicsDevice.Viewport;
_guiManager.InputCapturer = new DefaultInputCapturer(_inputManager);
_guiManager.Screen = new Screen(viewPort.Width, viewPort.Height);
_guiManager.Screen.Desktop.Bounds = new UniRectangle(
new UniScalar(0f, 0.0f), new UniScalar(0f, 0.0f), // x and y
    new UniScalar(1f, 0.0f), new UniScalar(1f, 0.0f)); // width and height
Is there something I'm missing?
Feb 4, 2011 at 7:11 PM
I haven't added touch support to the GUI yet, but it's coming. The current development trunk of Nuclex.Input already has full support for touch input.
As a quick workaround: the GUI obtains all of its input through its InputCapturer, and you can also feed artificial input directly to the Screen by calling its Inject...() methods. If you manually translate touches into MousePress, MouseMove and MouseRelease notifications on WP7, you'll be able to press buttons and drag things around just as you would with a mouse.
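A rough sketch of that workaround from your game's Update() method might look like the following. The exact Inject...() method names and the MouseButtons enum are assumptions based on the naming above, so check the Screen class in your copy of Nuclex.UserInterface for the actual signatures:

```csharp
using Microsoft.Xna.Framework;
using Microsoft.Xna.Framework.Input.Touch;

protected override void Update(GameTime gameTime) {
  // Translate each WP7 touch into equivalent mouse notifications for the GUI.
  foreach (TouchLocation touch in TouchPanel.GetState()) {
    switch (touch.State) {
      case TouchLocationState.Pressed:
        // Move the "cursor" to the touch point before pressing,
        // so the press lands on the control under the finger.
        _guiManager.Screen.InjectMouseMove(touch.Position.X, touch.Position.Y);
        _guiManager.Screen.InjectMousePress(MouseButtons.Left);
        break;
      case TouchLocationState.Moved:
        // Dragging a finger behaves like dragging with the mouse held down.
        _guiManager.Screen.InjectMouseMove(touch.Position.X, touch.Position.Y);
        break;
      case TouchLocationState.Released:
        _guiManager.Screen.InjectMouseRelease(MouseButtons.Left);
        break;
    }
  }
  base.Update(gameTime);
}
```

Note this simple version treats every touch as the left mouse button, so multi-touch would need extra handling.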