How to inject win32 keys : source code
Posted: Fri Dec 09, 2005 21:33
by Pacome
Hi all,
here is some code that may be useful for people who want to inject Win32 key events into CEGUI. I saw on the forum that some people needed it, so I am sharing it.
LRESULT CALLBACK WndProc(HWND wnd, UINT msgType, WPARAM wParam, LPARAM lParam)
{
    switch( msgType )
    {
    case WM_KEYUP:
        CEGUI::System::getSingleton().injectKeyUp( virtualkey2scancode(wParam, lParam) );
        break;
    case WM_KEYDOWN:
        CEGUI::System::getSingleton().injectKeyDown( virtualkey2scancode(wParam, lParam) );
        break;
    case WM_CHAR:
        CEGUI::System::getSingleton().injectChar( static_cast<CEGUI::utf32>(wParam) );
        break;
    default:
        // Let Windows handle everything we don't process ourselves
        return DefWindowProc(wnd, msgType, wParam, lParam);
    }
    return 0;
}
and the function you need for the conversion:
UINT virtualkey2scancode(WPARAM wParam, LPARAM lParam)
{
    if (HIWORD(lParam) & 0x0F00)
    {
        // Extended key: map the virtual key to its base scan code and set
        // the high bit, as DirectInput-style extended scan codes do
        UINT scancode = MapVirtualKey(wParam, 0);
        return scancode | 0x80;
    }
    else
    {
        // Regular key: the hardware scan code sits in bits 16-23 of lParam
        return HIWORD(lParam) & 0x00FF;
    }
}
It works well for all common and special keys.
One piece of advice: it would be great to document the input-injection mechanism a little more.
This library is great, and incomplete documentation could put some people off (the link to "The Beginner Guide to Injecting Inputs" on the tutorial website is broken or missing).
Best regards,
Pacôme
Re: How to inject win32 keys : source code
Posted: Wed Dec 14, 2005 17:45
by stephb7
I have not implemented my keyboard code yet, but it will use the following functions to handle multilingual input:
HKL hklKeyboardLayout = GetKeyboardLayout(0); // 0 means current thread
// This seemingly cannot fail.
// If this value is cached, the application must respond to WM_INPUTLANGCHANGE.

BYTE diks[256];
HRESULT hr = mlpDIKeyboard->GetDeviceState(sizeof(diks), diks);

UINT uiScanCode = MapVirtualKeyEx(diks[DIK_A],
                                  0, // Convert DIK_ code to scan code
                                  hklKeyboardLayout);
if (!uiScanCode)
{
    // Error converting to a scan code
}

BYTE buffer[2];
int iAscii = ToAsciiEx(DIK_A,
                       uiScanCode,
                       diks,
                       (LPWORD)buffer,
                       0, // 1 means a menu is active, 0 means no menu
                       hklKeyboardLayout);
if(iAscii > 0)
{
// We have 1 or 2 characters in our buffer
// If we have 2 characters then a dead key could not be combined
// with the virtual key to form a single character. But is the
// first digit the virtual key or the dead key?
}
else if(iAscii == 0)
{
// Virtual key could not be translated
}
else if(iAscii == -1)
{
// The virtual key corresponds to a dead key (accent or diacritic)
// Dead keys require an additional input to be resolved into a value
}
Re: How to inject win32 keys : source code
Posted: Thu Jan 12, 2006 20:10
by stephb7
Here's my code to convert a DirectInput key into a CEGUI utf value. It sort of works.
CEGUI::utf32 keycodeToUTF32( UINT keyCode)
{
    CEGUI::utf32 utf = 0;

    // Retrieve the keyboard layout in order to perform the necessary conversions
    HKL hklKeyboardLayout = GetKeyboardLayout(0); // 0 means current thread
    if (hklKeyboardLayout == 0)
        return utf;

    // Retrieve the keyboard state
    BYTE keyboardState[256];
    if (GetKeyboardState(keyboardState) == FALSE)
        return utf;

    /* Modes for MapVirtualKeyEx:
       0. Convert a virtual-key code into a scan code
       1. Convert a scan code into a virtual-key code;
          does not distinguish between left- and right-hand keys
       2. Convert a virtual-key code into an unshifted character value
          in the low-order word of the return value. Dead keys (diacritics)
          are indicated by setting the top bit of the return value.
       3. Windows NT/2000/XP: convert a scan code into a virtual-key code;
          distinguishes between left- and right-hand keys */
    UINT virtualKey = MapVirtualKeyEx(keyCode, 3, hklKeyboardLayout);
    if (virtualKey == 0) // No translation possible
        return utf;

    /* The 0 argument means that no menu is active.
       Return values of ToAsciiEx:
       -1. The key is a dead key
        0. No translation available
        1. A translation exists
        2. A dead key in buffer[1] could not be combined with the character;
           the value in buffer[0] is invalid */
    unsigned char buffer[2];
    if (ToAsciiEx(virtualKey, keyCode, keyboardState, (LPWORD) buffer, 0, hklKeyboardLayout) == 1)
        utf = buffer[0];

    return utf;
}
When I switch my keyboard to Canadian French mode, the / key now generates é. My function correctly converts the DirectInput scan code into the utf32 value 233 (0xE9 in hexadecimal), which Character Map (charmap.exe, shipped with Windows, viewed with the Arial font) identifies as "Latin Small Letter E with Acute".
The problem is that I'm unable to display this letter. I'm setting the font to Arial with the following:
mGUISystem = new CEGUI::System(mGUIRenderer, (const CEGUI::utf8*) "Player_Data/Logs/CEGUI_Interface.log");
CEGUI::Logger::getSingleton().setLoggingLevel(CEGUI::Informative); // Standard, Errors, Informative, Insane
CEGUI::Font* guiFont = CEGUI::FontManager::getSingletonPtr()->createFont("Arial", "c:/windows/fonts/arial.ttf", 12, CEGUI::FontFlag::Default);
mGUISystem->setDefaultFont(guiFont);
And creating the text box as:
CEGUI::Editbox* editBox = (CEGUI::Editbox*) CEGUI::WindowManager::getSingleton().createWindow("TaharezLook/Editbox", (CEGUI::utf8*)"Editor");
mEditorGuiSheet->addChildWindow(editBox);
editBox->setPosition(CEGUI::Point(0.35f, 0.55f));
editBox->setSize(CEGUI::Size(0.3f, 0.1f));
editBox->setFont("Arial");
And injecting the characters with:
CEGUI::utf32 utf = keycodeToUTF32(arg.key);
CEGUI::System::getSingleton().injectChar(utf);
arg.key contains the value of 0x35, DIK_SLASH. Is there something else I'm supposed to do?
Re: How to inject win32 keys : source code
Posted: Thu Jan 12, 2006 20:19
by lindquist
Remember to add the glyphs to the font. By default a font only loads the ASCII glyphs, and since you load the font entirely in code, I assume you'll want to set the glyphs from code as well.
So either pass the glyphs' utf32 codepoints numerically, or use a string and save your source file as UTF-8.
Re: How to inject win32 keys : source code
Posted: Thu Jan 12, 2006 21:13
by stephb7
Thanks, specifying
guiFont->defineFontGlyphs(230, 250);
displayed my é but made me lose every other character. I'll look into creating a multilingual glyph set that enables all the characters I need.
Re: How to inject win32 keys : source code
Posted: Thu Jan 12, 2006 21:32
by stephb7
I forgot to mention that my keycodeToUTF32() function can only convert single-key characters. On a Canadian French keyboard the letter é is obtained by pressing /, and the letter É by pressing SHIFT and / together; SHIFT, CONTROL, and CAPS LOCK do not count as separate keys in this case.
However, generating è or È requires first pressing right-ALT and [ together, releasing both, and then pressing e (or SHIFT-e). These letters require two keystrokes (a dead key followed by a real key), and my function cannot convert that combination into a single character.
Re: How to inject win32 keys : source code
Posted: Sat Jan 14, 2006 19:28
by jacmoe
Code snippets like this are bound to vanish quickly, buried in a CEGUI forum...
Consider logging in to the CEGUI Wiki and adding it there.
Re: How to inject win32 keys : source code
Posted: Mon Jan 16, 2006 13:22
by stephb7
I finally figured out how to handle dead keys. Wikied here:
http://www.cegui.org.uk/wiki/index.php/ ... EGUI_utf32
Hope this helps.
Posted: Sat Feb 18, 2006 07:46
by pjcast
I just want to thank you for this code snippet (and Rackle helped too). It helped me integrate language-neutral character translation into my input library (OIS,
http://www.sourceforge.net/projects/wgois), which now uses this on Win32 (and the X API on Linux) for translating keys to text.
Posted: Sat Feb 18, 2006 14:36
by Rackle
stephb7 and Rackle are the same person; I switched to this same name on various message boards.
Posted: Fri Mar 31, 2006 17:26
by Mystique7
Cheers Pacome.. the code really helped!
Bump!