How do I trigger a screen change? Is there an easy way to tell in my logic which screen is showing so that I can manage button functionality? I ask because this is the first display software I've used where button function isn't directly tied to which screen you're on - which is fine if there's an absolute way of knowing which screen I'm on.
There are two ways I know of: Screen control signals and screen output signals.
Each screen is a separate component (yet another rectangular box) in your application program, and each has a boolean "SHOW" input. I usually make a screen-management page that contains all the control logic for managing a set of mutually exclusive screen-enable signals, one per screen, which are then used both to enable a screen and to enable the control logic for certain button changes.
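If it helps, here's a rough sketch of the kind of mutually exclusive enable logic I mean. This is plain Python rather than the tool's own block language, and the screen names are just made up for illustration:

```python
# Hypothetical sketch (not the tool's own language): one-hot screen-enable
# logic, where requesting a screen clears every other enable signal, so
# exactly one screen's SHOW input is true at a time.

SCREENS = ["MAIN", "ALARMS", "ENGINE", "SETTINGS"]

# One boolean enable per screen, all driven from a single management routine.
screen_enable = {name: False for name in SCREENS}
screen_enable["MAIN"] = True  # default screen at startup

def request_screen(name: str) -> None:
    """Enable the requested screen and disable all others (mutually exclusive)."""
    for screen in screen_enable:
        screen_enable[screen] = (screen == name)

# Button logic can key off the same enable signals, e.g. a "+" button
# only adjusts a setpoint while the SETTINGS screen is showing.
request_screen("SETTINGS")
assert screen_enable["SETTINGS"] and not screen_enable["MAIN"]
```

The point is that the button handlers and the screens all read the same enable signals, so there's never any ambiguity about which screen is showing.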
Each screen definition can also produce output signals, which is something new. The individual objects within a screen have a boolean status flag named "Is Visible", which you would have to tie to a signal on the OUT bus. I haven't used these yet. It seems a bit redundant to me, because the condition that enabled the screen object should already exist on the input side.
The biggest change to wrap my mind around was how you first create screen "definitions", before they actually exist anywhere in an application page. Then you drop a "show screen" component into your application page and select which of the screen definitions that component will display. Just because you've made a screen definition doesn't mean your application needs to use it. And you can (in theory) use the same definition in multiple show-screen components, as long as the IN/OUT buses have the same signal names. This could be useful for something like a common valve block or multiple engines, where you have identical signal buses for each valve block or engine and effectively just pick which bunch of data the screens show. I haven't implemented this in a project yet.
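To make the definition-vs-instance idea concrete, here's a tiny sketch in the same spirit (again plain Python, with invented names like EngineBus, rpm, and oil_press), showing one "definition" pointed at two different but identically shaped signal buses:

```python
# Hypothetical sketch: one screen "definition" reused by two show-screen
# components, each wired to a different bus that carries the same signal names.
from dataclasses import dataclass

@dataclass
class EngineBus:
    rpm: float
    oil_press: float

def engine_screen(bus: EngineBus) -> str:
    """The 'definition': renders whatever bus it happens to be pointed at."""
    return f"RPM: {bus.rpm:.0f}  Oil: {bus.oil_press:.1f} bar"

# Two "instances" of the same definition, fed by different data.
port_engine = EngineBus(rpm=1480, oil_press=4.2)
stbd_engine = EngineBus(rpm=1500, oil_press=4.0)

print(engine_screen(port_engine))  # port engine's numbers
print(engine_screen(stbd_engine))  # starboard engine's numbers
```

Same layout, same signal names, just a different bunch of data behind each one, which is exactly why the signal names on the buses have to match.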
The second biggest difference is the management of bus signals, which is now a two-step process to get the signals to your screen editing window. You need to "Query" (not "Enter") each screen component and then set up which of the signals from the IN bus get sent to the screen. Pretty weird and tedious, and unlike any other HMI software I've worked in.
Beware the disproportionate use of object-oriented software terminology in the manual, which I personally feel is unnecessary and confusing, even though I write OO software.
On that note, here's a little screenshot of an example I tossed together from an existing project, with the relevant parts pasted onto the same page for your viewing pleasure.