Monday, July 20, 2009

NIWeek 2009


Are you going to NIWeek 2009? The conference program is online, so you can plan which sessions you want to see.

Closed Door Session

I'm not presenting any regular sessions this year, but I am holding a closed-door session to get feedback on a project that I'm working on. There are a few spots still available. The target audience is people who teach LabVIEW, either formally (teaching classes) or informally (mentoring colleagues).

If you are interested in participating, please e-mail me at my GMail account (eyesonvis at gmail.com). (Note: I will reply to every e-mail on this topic so that you know your e-mail wasn't blocked by the spam filter. If you don't receive a reply after a day, please leave a comment here to let me know).

Details
  • It's Thursday, August 6, 10AM-Noon.
  • You would need to sign an NDA (non-disclosure agreement) if you are not already under one.
  • Spots are limited, so I apologize in advance if you want to attend but I can't get you in.

Where Else I'll Be

Other places you will probably find me during NIWeek:
  • I plan to visit the LabVIEW Experts Panel in the Technology Theater on Tuesday, Noon-1PM, to hear what "insight and advice" people seek from "NI engineers working on the latest version of LabVIEW" (seeing as how I am one of those).
  • I don't want to miss the Challenge the Champions event at the Technology Theater on Tuesday from 5PM-6PM.
  • I'm really looking forward to the LAVA/OpenG BBQ at Stubb's. Tickets are still available if you want to join in.
  • Of course, there are a LOT of sessions I want to see, but NI employees only get in if the rooms don't fill up, so I can't promise which ones I'll actually be at.
I hope to see you at NIWeek! As a "thank you" to my blog readers, I'll have a few hand-crafted "Eyes on VIs" buttons to give away. If you see me there, ask for one! :-)

Friday, July 03, 2009

LVSpeak: Automating VI development through speech

Eyes on VIs is pleased to welcome its first guest blogger, Norm "The Captain" Kirchner! Norm is the first person in history to sacrifice his LabVIEW Champion status for the pleasure of working at National Instruments. (NI Employees cannot be LabVIEW Champions). Norm has been using LabVIEW for over 9 years and he is going to share his "LVSpeak" project, which (I hope you will agree) is pretty darn amazing. Thanks for joining us, Norm!

- Christina




Imagine if LabVIEW were able to read your thoughts and react to them. You just think “edit icon” and *pop*, the icon editor opens for you immediately. Imagine if every time you wanted to drop property nodes for a control or group of controls, they just showed up on the diagram.

Although this ability is not implemented in LabVIEW yet, we can get darn close by using our voice and a little creative coding.

LVSpeak (LabVIEW Speak) is a very simple concept with a great deal of possibility. It utilizes the Microsoft Speech API (SAPI) and provides an open and extendable interface to it within LabVIEW. Currently, two extensions of that architecture exist: ‘Quick Edit’ and ‘Speech-Enabled Quick Drop’.

These extensions of LVSpeak and the LabVIEW development environment give the coder tools that can speed up code creation and modification by upwards of 70%.


Note: The volume on the videos is very low; turn your speakers all the way up.


Video 1: Code creation


WHY

LabVIEW is an easy-to-use and intuitive application development environment and programming language, but there are still some basic actions that require multiple clicks and force the user to navigate through a variety of menus and options.

LVSpeak was developed to take those minor but time-consuming steps and reduce them to a single voice command. At the root of this entire effort is one simple premise: “A good engineer is a lazy engineer.” And until we can program LabVIEW with our minds, turning a 4-step, 3-second action into a 1-step, half-second voice command makes me a happy, lazy engineer.

At the end of the day, any action that requires you to remove your hand from the mouse (Ctrl+I), or that needs more than two mouse clicks and navigation deeper than a top-level context menu (Label Visible), warrants becoming a voice command.

HISTORY

When LabVIEW Scripting was still very new in LabVIEW 7.0 and some of its functionality was accidentally exposed by NI, it occurred to me that you could combine this ability to use LabVIEW code to write LabVIEW code with Microsoft's free speech recognition technology and do some creative things. Although it was an interesting idea, integrating DLLs and ActiveX objects into G was still foreign to me and presented a barrier that caused LVSpeak to sit dormant until NIWeek 2008 and the release of Quick Drop.

Almost immediately, I recognized a synergy between this great new development tool in LabVIEW and the still-undeveloped LVSpeak. This was confirmed further when I watched the coding speed challenge at NIWeek 2008, where the creator of Quick Drop, Darren Nattinger, was slowed to a crawl doing simple tasks like creating a constant or typing verbose function names in the Quick Drop window.

At that point, I realized that all the components needed to let a developer program as fast as they could imagine the code were in place.

HOW

The how is actually simpler than I would like to admit.

There are two key components of the Microsoft SAPI that are utilized in LVSpeak:

  • Grammar List
  • ‘Speech Recognized’ .NET callback event

Within LabVIEW, two components are required to enable speech recognition in any program (see the sketch below):

  • Load Command List (Grammar)
  • Register for speech recognized event
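
To make those two pieces concrete outside of G, here is a minimal sketch of the same idea in Python, going through SAPI's COM automation interface (via pywin32) rather than the .NET interface LVSpeak uses from LabVIEW. The command phrases and rule name are hypothetical placeholders, and it assumes a Windows machine with Microsoft speech recognition available.

  # Sketch only (not LVSpeak itself): load a command "grammar" and react to
  # SAPI's recognition event, using the COM automation interface via pywin32.
  import pythoncom
  import win32com.client

  COMMANDS = ["edit icon", "label side", "create constant"]  # hypothetical command list

  class RecoEvents:
      # Event sink: OnRecognition fires whenever SAPI matches one of the commands.
      def OnRecognition(self, stream_number, stream_position, recognition_type, result):
          phrase = win32com.client.Dispatch(result).PhraseInfo.GetText()
          print("Recognized:", phrase)

  # Shared recognition context (default recognizer and microphone) with the event
  # sink attached -- this is the 'Speech Recognized' callback piece.
  context = win32com.client.DispatchWithEvents("SAPI.SpSharedRecoContext", RecoEvents)

  # Build the grammar list: one top-level rule with a word transition per command.
  grammar = context.CreateGrammar()
  rule = grammar.Rules.Add("Commands", 1 | 2)    # SRATopLevel | SRADynamic
  for cmd in COMMANDS:
      rule.InitialState.AddWordTransition(None, cmd)
  grammar.Rules.Commit()
  grammar.CmdSetRuleState("Commands", 1)         # SGDSActive: start listening

  # Pump COM messages so recognition events reach the sink.
  while True:
      pythoncom.PumpWaitingMessages()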

The way everything plays together is quite simple.

LVSpeak Core starts and:

  1. Initializes the Microsoft SAPI
  2. Creates a LV User Event
  3. Registers a callback VI that runs when speech is detected and fires the LV User Event with the detected command

Programs utilizing LVSpeak (see the sketch after this list):
  1. Register for the Grammar Detected event
  2. Load their command list into the “Grammar”
  3. Catch the fired event and respond to the detected string accordingly
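
In rough Python terms (with hypothetical command names, and leaving out the speech plumbing sketched above), that program-side flow boils down to a lookup keyed on the detected string; in LabVIEW, the equivalent is a User Event registration plus an Event Structure case doing the same lookup.

  # Program-side sketch: the command list doubles as the grammar, and the handler
  # dispatches on whatever string the recognition event delivers.
  COMMAND_ACTIONS = {                              # hypothetical command list
      "edit icon": lambda: print("opening the icon editor..."),
      "label side": lambda: print("moving selected labels to the side..."),
      "create constant": lambda: print("creating a constant..."),
  }

  def on_speech(phrase: str) -> None:
      # Catch the fired event and run the code that corresponds to the command.
      action = COMMAND_ACTIONS.get(phrase.lower().strip())
      if action is not None:
          action()
      else:
          print("No action registered for:", phrase)

  # Simulated recognition events, standing in for strings delivered by SAPI.
  for detected in ["Edit Icon", "Create Constant", "open palette"]:
      on_speech(detected)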

For Speech-Enabled Quick Drop, the grammar list is everything in the functions palette, and whenever the LVSpeak event is caught, the detected string is loaded into the Quick Drop text box as if you had just typed it.

Quick Edit follows the same flow: its grammar list is the set of Quick Edit commands, pulled from an enumeration, and when one of those commands is detected, the corresponding code runs to execute that command.



Video 2: Basic detection

Video 3: Quick Edit Demo


Getting Started

To start developing LabVIEW with your voice, you need to download and install:

Once all these parts are installed, you should see a new item in your LabVIEW Tools menu: "Enable LVSpeak".


Select that option, and you should see two floating windows show up.
If your microphone is active, manually drop some controls on your front panel, flip over to the block diagram, select all the controls, and say "Label Side" in a relatively monotonic voice.



Summary


If you program LabVIEW on a regular basis, begin paying attention to how long some tasks take and how often you repeat basic actions that could be streamlined by a little scripting automation.


Controlling LV with your voice is not just a novel idea.
It is truly a quantum leap forward in how you develop your code and a HUGE performance booster.

So wipe the dust off of your microphone or headset and get ready to take your development process to the next level.

~,~
The Captain Was Here

Norm Kirchner

PS: Thank you, Christina, for your patience with me in getting this out to you, and for providing a great resource for the rest of us LabVIEW nuts out in the world.
