Get Slot Value Alexa Python

This example was ported from the PyQt4 version by Guðjón Guðjónsson.

Introduction

In some applications it is often necessary to perform long-running tasks, such as computations or network operations, that cannot be broken up into smaller pieces and processed alongside normal application events. In such cases, we would like to be able to perform these tasks in a way that does not interfere with the normal running of the application, and ensure that the user interface continues to be updated. One way of achieving this is to perform these tasks in a separate thread to the main user interface thread, and only interact with it when we have results we need to display.

This example shows how to create a separate thread to perform a task - in this case, drawing stars for a picture - while continuing to run the main user interface thread. The worker thread draws each star onto its own individual image, and it passes each image back to the example's window which resides in the main application thread.

The User Interface

We begin by importing the modules we require. We need the math and random modules to help us draw stars.
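The original import block is not reproduced on this page; a minimal set consistent with the description might look like the following (PyQt5 module names are assumed here, since the example is described as a port of the PyQt4 version):

    import sys
    import math
    import random

    from PyQt5.QtCore import Qt, QRect, QSize, QThread, pyqtSignal
    from PyQt5.QtGui import QColor, QImage, QPainter, QPainterPath, QPixmap
    from PyQt5.QtWidgets import (QApplication, QGridLayout, QLabel,
                                 QPushButton, QSpinBox, QWidget)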

The main window in this example is just a QWidget. We create a single Worker instance that we can reuse as required.

The user interface consists of a label, a spin box and a push button that the user interacts with to configure the number of stars that the thread will draw. The output from the thread is presented in a QLabel instance, viewer.

We connect the standard finished() and terminated() signals from the thread to the same slot in the widget. This will reset the user interface when the thread stops running. The custom output(QRect, QImage) signal is connected to the addImage() slot so that we can update the viewer label every time a new star is drawn.

Inside the skill's code, Alexa triggers a handler such as AnswerIntentHandler, where we can fetch a slot value. The slots live deep inside the handler_input.request_envelope.request.intent.slots object; we can look a slot up by its name and then read its value attribute. Alternatively, ask_sdk_core.utils.request_util.get_slot_value(handler_input, slot_name) returns the slot value from the intent request for the given slot name. Keep in mind that the usability of the skill depends directly on how well the sample utterances and custom slot values represent real-world language use, so build a representative set.
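As an illustration, here is a minimal sketch of both approaches using the ASK SDK for Python; the intent name "AnswerIntent" and the slot name "city" are placeholders, not taken from the original skill:

    from ask_sdk_core.dispatch_components import AbstractRequestHandler
    from ask_sdk_core.utils import is_intent_name, get_slot_value


    class AnswerIntentHandler(AbstractRequestHandler):
        def can_handle(self, handler_input):
            return is_intent_name("AnswerIntent")(handler_input)

        def handle(self, handler_input):
            # Option 1: walk the request envelope directly.
            slots = handler_input.request_envelope.request.intent.slots
            city = slots["city"].value if "city" in slots else None

            # Option 2: use the convenience helper, which does the same lookup.
            city = get_slot_value(handler_input=handler_input, slot_name="city")

            speech = "You asked about {}.".format(city)
            return handler_input.response_builder.speak(speech).response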

The start button's clicked() signal is connected to the makePicture() slot, which is responsible for starting the worker thread.

We place each of the widgets into a grid layout and set the window's title:
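The original listing is not shown here, but under the PyQt5 assumptions above the window could be assembled roughly as follows (new-style signal connections; the terminated() signal mentioned earlier exists only in Qt 4, so only finished() is connected in this sketch, and the widget sizes, limits and window title are illustrative):

    class Window(QWidget):
        def __init__(self, parent=None):
            super(Window, self).__init__(parent)

            self.thread = Worker()

            label = QLabel("Number of stars:")
            self.spinBox = QSpinBox()
            self.spinBox.setMaximum(10000)
            self.spinBox.setValue(500)
            self.startButton = QPushButton("Start")
            self.viewer = QLabel()
            self.viewer.setFixedSize(300, 300)

            # Reset the user interface when the thread stops running, and
            # update the viewer whenever the worker reports a new star.
            self.thread.finished.connect(self.updateUi)
            self.thread.output.connect(self.addImage)
            self.startButton.clicked.connect(self.makePicture)

            layout = QGridLayout()
            layout.addWidget(label, 0, 0)
            layout.addWidget(self.spinBox, 0, 1)
            layout.addWidget(self.startButton, 0, 2)
            layout.addWidget(self.viewer, 1, 0, 1, 3)
            self.setLayout(layout)
            self.setWindowTitle("Simple Threading Example")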

The makePicture() slot needs to do three things: disable the user interface widgets that are used to start a thread, clear the viewer label with a new pixmap, and start the thread with the appropriate parameters.

Since the start button is the only widget that can cause this slot to be invoked, we simply disable it before starting the thread, avoiding problems with re-entrancy.

We call a custom method in the Worker thread instance with the size of the viewer label and the number of stars, obtained from the spin box.
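A sketch of the slot, as a method of the Window class, following the three steps described above (the plain black background is an assumption, since the original listing is not shown):

    def makePicture(self):
        # The start button is the only way into this slot, so disabling
        # it is enough to avoid re-entrancy while the thread runs.
        self.startButton.setEnabled(False)

        # Clear the viewer label with a fresh pixmap (a black background
        # is assumed here).
        pixmap = QPixmap(self.viewer.size())
        pixmap.fill(Qt.black)
        self.viewer.setPixmap(pixmap)

        # Start the worker with the viewer size and the requested number
        # of stars from the spin box.
        self.thread.render(self.viewer.size(), self.spinBox.value())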

Whenever a star is drawn by the worker thread, it emits a signal that is connected to the addImage() slot. This slot is called with a QRect value, indicating where the star should be placed in the pixmap held by the viewer label, and an image of the star itself:

We use a QPainter to draw the image at the appropriate place on the label's pixmap.
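A possible implementation of the slot, again only a sketch:

    def addImage(self, rect, image):
        # Draw the star image at the position chosen by the worker,
        # then put the updated pixmap back on the label.
        pixmap = self.viewer.pixmap()
        painter = QPainter(pixmap)
        painter.drawImage(rect, image)
        painter.end()
        self.viewer.setPixmap(pixmap)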

The updateUi() slot is called when a thread stops running. Since we usually want to let the user run the thread again, we reset the user interface to enable the start button to be pressed:
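A correspondingly small sketch of updateUi(), as a method of the Window class:

    def updateUi(self):
        # The thread has stopped; let the user start it again.
        self.startButton.setEnabled(True)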

Now that we have seen how an instance of the Window class uses the worker thread, let us take a look at the thread's implementation.

The Worker Thread

The worker thread is implemented as a PyQt thread rather than a Python thread since we want to take advantage of the signals and slots mechanism to communicate with the main application.

We define size and stars attributes that store information about the work the thread is required to do, and we assign default values to them. The exiting attribute is used to tell the thread to stop processing.

Each star is drawn using a QPainterPath that we define in advance:
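Since the listing is missing here, the following sketch shows how the Worker class could begin under the same PyQt5 assumptions; the output signal signature matches the output(QRect, QImage) signal described earlier, and the star radii are illustrative values:

    class Worker(QThread):
        # Custom signal carrying the placement and image of each new star.
        output = pyqtSignal(QRect, QImage)

        def __init__(self, parent=None):
            super(Worker, self).__init__(parent)

            self.exiting = False
            self.size = QSize(0, 0)
            self.stars = 0

            # Build the star shape once up front; each star is later drawn
            # by filling this path. The radii are illustrative values.
            self.outerRadius = 20
            self.innerRadius = 8
            self.path = QPainterPath()
            angle = 2 * math.pi / 5
            self.path.moveTo(self.outerRadius, 0)
            for step in range(1, 6):
                self.path.lineTo(
                    self.innerRadius * math.cos((step - 0.5) * angle),
                    self.innerRadius * math.sin((step - 0.5) * angle))
                self.path.lineTo(
                    self.outerRadius * math.cos(step * angle),
                    self.outerRadius * math.sin(step * angle))
            self.path.closeSubpath()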

Before a Worker object is destroyed, we need to ensure that it stops processing. For this reason, we implement the following method, which tells the part of the object that performs the processing to stop, and then waits until it does so.
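A sketch of such a method, implemented as the destructor:

    def __del__(self):
        # Tell run() to stop at the next opportunity, then block until
        # the thread has actually finished.
        self.exiting = True
        self.wait()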

For convenience, we define a method to set up the attributes required by the thread before starting it.

The start() method is a special method that sets up the thread and calls our implementation of the run() method. We provide the render() method instead of letting our own run() method take extra arguments because the run() method is called by PyQt itself with no arguments.
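A sketch of render(), which simply records the parameters and starts the thread:

    def render(self, size, stars):
        # Record the work to be done, then let QThread.start() invoke run().
        self.size = size
        self.stars = stars
        self.start()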

The run() method is where we perform the processing that occurs in the thread provided by the Worker instance:

Information stored as attributes in the instance determines the number of stars to be drawn and the area over which they will be distributed.

We draw the number of stars requested as long as the exiting attribute remains False. This additional check allows us to terminate the thread on demand by setting the exiting attribute to True at any time.

The drawing code is not particularly relevant to this example. We simply draw on an appropriately-sized transparent image.

For each star drawn, we send the main thread information about where it should be placed along with the star's image by emitting our custom output() signal:
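Putting these pieces together, run() might look roughly like this; only the overall structure follows the description above, and the colour, size and placement choices are illustrative:

    def run(self):
        random.seed()
        n = self.stars
        width = self.size.width()
        height = self.size.height()
        side = 2 * self.outerRadius

        while not self.exiting and n > 0:
            # Each star gets its own small, fully transparent image.
            image = QImage(side, side, QImage.Format_ARGB32)
            image.fill(Qt.transparent)

            # Random placement, rotation and colour for this star.
            x = random.randrange(0, max(1, width - side))
            y = random.randrange(0, max(1, height - side))
            angle = random.randrange(0, 360)
            color = QColor(random.randrange(256), random.randrange(256),
                           random.randrange(256), random.randrange(64, 256))

            painter = QPainter(image)
            painter.setRenderHint(QPainter.Antialiasing)
            painter.setPen(Qt.NoPen)
            painter.setBrush(color)
            painter.translate(self.outerRadius, self.outerRadius)
            painter.rotate(angle)
            painter.drawPath(self.path)
            painter.end()

            # Hand the main thread the placement rectangle and the image.
            self.output.emit(QRect(x, y, side, side), image)
            n -= 1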

Since QRect and QImage objects can be serialized for transmission via the signals and slots mechanism, they can be sent between threads in this way, making it convenient to use threads in a wide range of situations where built-in types are used.

Running the Example

We only need one more piece of code to complete the example:
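That missing piece is the usual PyQt entry point, assuming the Window class sketched above:

    if __name__ == "__main__":
        app = QApplication(sys.argv)
        window = Window()
        window.show()
        sys.exit(app.exec_())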

Prerequisites:

  1. An Amazon developer account.
  2. An Amazon Web Services account.
  3. Basics of Python programming.

Topics Covered:

  1. Understanding the architecture of Alexa.
  2. Lambda function to create a custom skill.
  3. Invocation name, utterances and slots.
  4. Zipping the files and their dependencies.
  5. Uploading and testing the skill.
  6. Submitting the skill for certification.

1. Let us understand the basic architecture of Alexa with a simple diagram

The user enables the skill from the Amazon Alexa app on their device; the skill then needs to be invoked using its ‘Invocation Name’. In this case, ‘Rail Gaadi’ is the invocation name. When the user prompts with an utterance, Alexa recognizes the utterance and converts it into a JSON object. Our custom skill reads the JSON object, processes the request, and sends the response back to Alexa.

2. Creating a Lambda function and using the sample blueprint to develop a custom skill

  • Log in to the AWS Lambda console and click on Create function.
  • The function name should be as simple as possible, as it will be used in the Amazon Resource Name (ARN).
  • AWS Lambda calls the lambda_handler function to start your service; it is just like a main method or function that AWS Lambda understands.
  • In the context of creating custom skills, there are a few functions that need to be handled inside lambda_handler, as mentioned below (see the sketch after this list).
  • on_session_started, on_launch, on_intent and on_session_ended: as the names suggest, all of these need to be handled inside lambda_handler.
  • on_session_started and on_session_ended are where your session starts and ends, respectively.
  • on_launch is where you prompt the user with a welcome note or with your app's details.
  • on_intent is where you implement the logic.
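A minimal sketch of that dispatch, based on the standard Alexa Python blueprint and the helper functions named in the list above (their bodies are omitted here):

    def lambda_handler(event, context):
        # Entry point invoked by AWS Lambda for every request from Alexa.
        request = event['request']
        session = event['session']

        if session['new']:
            on_session_started({'requestId': request['requestId']}, session)

        if request['type'] == 'LaunchRequest':
            return on_launch(request, session)
        elif request['type'] == 'IntentRequest':
            return on_intent(request, session)
        elif request['type'] == 'SessionEndedRequest':
            return on_session_ended(request, session)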

Note: take time to understand the sample code in the blueprint, and life will be easy.

3. Invocation Name, Utterances and Slots

Let us understand the invocation name, intents and utterances one by one, as these are very important for any skill to understand requests and respond correctly.

Invocation Name:

  • Alexa recognizes the skill by the invocation name provided; in this case, Rail Gaadi is the invocation name.
  • Alexa responds with a welcome message when the user does not provide any specific request. Example: “Alexa, Open Rail Gaadi”.

Utterance:

  • A spoken statement or sentence that Alexa understands in order to perform some action.
  • In our case, Alexa responds with the output when the user provides the invocation name with an utterance. Example: “Alexa, Open Rail Gaadi to get the status of 12345”.

Slots:

  • Slots represent how data or values are handled when processing requests.
  • Amazon provides built-in slots, which can be found here.
  • You can provide your own custom slots if required.

Hurray! The theoretical journey is complete. We are just a few steps away from submitting the Alexa app.

  1. Log in to the Amazon developer console.
  2. Click on the Alexa tab.

Click on the Get Started button under the Alexa tab; you will be redirected to the add skill page.

3. Click on the Add a New Skill button as shown below.

4. After clicking on Add a New Skill, we land on the Skill Information page, where we have to provide app-related information: Skill Type, Language, Name, Invocation Name and Global Fields.

  • Skill Type should be set to Custom Interaction Model, as this is a custom skill.
  • Language is specific to the region. Rail Gaadi is for the Indian market, so I have selected English (India).
  • Name is what the user sees when searching in the Amazon Alexa app. The name should be short and easy to remember.
  • Make sure the invocation name and the name are the same.
  • Global Fields can be selected depending on the application's requirements.

5. The Interaction Model is where we provide the sample utterances, slots and intents. Here I have taken another example where the user asks for a place's history. The console provides both a code editor and a GUI; it is up to you which one to use.

6. After building and saving the model, we land on the Configuration tab. This is crucial, as we are going to connect the skill to the Lambda function code by providing the ARN value in the Default field.

Here comes the tricky part: zipping the code and uploading it to the Lambda function.

7. Before zipping the folder, make sure all of the error handling is complete.

  • Create and activate a virtual environment; let us name it “zipfol”:
  • virtualenv zipfol
  • cd zipfol
  • source ./bin/activate
  • Create a requirements.txt file listing the required libraries and install them:
  • pip install -r requirements.txt
  • Move the installed site-packages into the root of the folder and zip it:
  • mv ./lib/python2.7/site-packages/* ../zipfol/
  • zip -r zipfol.zip *

8. Upload the zip file to the Lambda function. If the zip file is larger than 10 MB, use an S3 bucket instead: it is very simple, just log in to the S3 console, create a bucket, upload the zip file, and use its link in the Lambda function.

9. Test the code: provide a sample utterance in the service simulator and verify the response, as shown below.

10. The Publishing Information section is very important, as it is user facing: the information provided there is shown to users and helps them find, enable and access the skill with ease.

11. Privacy and Compliance: finally, we are one step away from submitting the skill to the Amazon Alexa store. Fill in the necessary details, accept the compliance terms, and click on Submit for Certification.

Note: Amazon performs all the basic and necessary testing of your app, and if it passes the certification process, you will receive an email and the skill will be live soon.

Thank you for reading this blog. Please leave your comments and share it.