Cognitive Services Tutorial #5: Test the app locally

You will run the app locally in order to test it and familiarize yourself with its features. Running it locally is a simple matter of firing up a Node.js server process to host your server components (in this case, server.js) and pointing your browser to http://localhost:port, where port is the port number on which the server process is listening for HTTP requests.
server.js listens on port 9898. You can modify that if you would like by changing line 7 in the code.
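If you are curious what that port binding looks like, the sketch below shows a typical arrangement. It assumes an Express-based server and is purely illustrative; the real server.js is the source of truth, and only the port value (9898) comes from the tutorial.

 // Hypothetical sketch of how a server like server.js binds to a port
 const express = require('express');
 const app = express();

 const port = 9898; // corresponds to line 7 of server.js; edit to listen on another port

 // ... route handlers for uploading photos, searching, and so on ...

 app.listen(port, () => {
     console.log('Intellipix listening on http://localhost:' + port);
 });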

1. Return to the Command Prompt or Terminal window and, once more, make sure you’re in the “Intellipix” directory that you created for the project. Then execute the following command to start server.js:

 node server.js

2. Open your browser and navigate to http://localhost:9898.

3. Click the Browse button and upload one of the images. After a few seconds, a thumbnail version of the photo appears on the page.

4. Upload a few more images. Confirm that they appear on the page, too.

5. Hover the cursor over one of the image thumbnails. Confirm that a tooltip appears containing a caption for the image. This is the caption that was generated by the Computer Vision API and stored in blob metadata.
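Under the hood, the tooltip is nothing more than the thumbnail’s title attribute. Here is a minimal client-side sketch with hypothetical names; the page’s actual markup and data shape may differ:

 // Sketch: assign the computer-generated caption to the title attribute,
 // which the browser renders as a tooltip on hover (all names are illustrative)
 function addThumbnail(gallery, thumbnailUrl, caption) {
     const img = document.createElement('img');
     img.src = thumbnailUrl;
     img.title = caption; // shown as a tooltip when the cursor hovers over the image
     gallery.appendChild(img);
 }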

6. Click the thumbnail to display an enlarged version of the image in a lightbox. Confirm that the computer-generated caption appears at the top of the lightbox. Then dismiss the lightbox.

7. Upload several more photos. Feel free to upload photos of your own.

8. Type a keyword describing something you see in the images (“river,” for example) into the search box. Search results will vary depending on what you typed and which images you uploaded, but the result should be a filtered list of images: those whose metadata keywords include all or part of the keyword you typed. A sketch of the idea follows this step.
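One plausible matching rule looks like the following; the names are hypothetical and the app’s actual search logic may differ, but the idea is the same: keep only the images whose tags match the typed term.

 // Sketch: filter images whose metadata tags contain the search term
 // ('images' and its 'tags' property are illustrative names)
 function filterImages(images, keyword) {
     const term = keyword.toLowerCase();
     return images.filter(image =>
         image.tags.some(tag => tag.toLowerCase().includes(term))
     );
 }

 // e.g., filterImages(allImages, 'river') keeps only the images tagged "river"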

9. Use your browser’s View Source command to view the source for the page. Find the <img> elements representing the image thumbnails. Observe that the URLs assigned to the images refer directly to blobs in blob storage. This is possible because you set the containers’ Access type to Blob, which makes the blobs inside them publicly accessible.
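Blob URLs follow a predictable pattern. Here is a sketch with placeholder names; yours will use the storage account you created in Exercise 1:

 // Sketch: the URL pattern for a publicly accessible blob
 // ('mystorageaccount' and 'photo1.jpg' are placeholders)
 const accountName = 'mystorageaccount';
 const blobName = 'photo1.jpg';
 const thumbnailUrl = `https://${accountName}.blob.core.windows.net/thumbnails/${blobName}`;
 // renders as <img src="https://mystorageaccount.blob.core.windows.net/thumbnails/photo1.jpg">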

10. Return to the Command Prompt or Terminal window and press Ctrl+C to stop the Node server.

11. Return to the Microsoft Azure Storage Explorer (or restart it if you didn’t leave it running) and click the “photos” container under the storage account you created in Exercise 1. The number of blobs in the container should equal the number of photos you uploaded. Double-click one of the blobs to download it and see the image stored in the blob.

12. Open the “thumbnails” container in Storage Explorer. How many blobs do you see there? Open one of the blobs to see what’s inside. These are the thumbnail images generated from the image uploads.
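If you’re wondering how those thumbnails come to be, the server resizes each uploaded image before writing it to the “thumbnails” container. The sketch below uses the sharp package, which is an assumption; the tutorial’s server.js may use a different imaging library:

 // Sketch: generate a thumbnail from an uploaded image buffer
 // (the sharp package and the width of 192 pixels are assumptions)
 const sharp = require('sharp');

 async function makeThumbnail(imageBuffer) {
     // Resize to a fixed width; the height follows automatically,
     // preserving the aspect ratio
     return sharp(imageBuffer).resize({ width: 192 }).toBuffer();
 }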

13. Want to see where the metadata generated by the Computer Vision API is being stored? Open the “photos” container again. Right-click any of the blobs in the container and select Properties. In the ensuing dialog, you’ll see a list of the metadata attached to the blob. Each metadata item is a key-value pair. The computer-generated caption is stored in the item named “caption,” while the keywords generated from the image are stored in a JSON string array named “tags.”
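Because blob metadata values are plain strings, the tags array has to be JSON-parsed before it can be searched. Here is a sketch using the current @azure/storage-blob package; the tutorial’s code may use an older SDK, and the connection-string variable is an assumption:

 // Sketch: read the caption and tags attached to a photo blob
 const { BlobServiceClient } = require('@azure/storage-blob');

 async function getImageMetadata(blobName) {
     const service = BlobServiceClient.fromConnectionString(process.env.AZURE_STORAGE_CONNECTION_STRING);
     const blob = service.getContainerClient('photos').getBlobClient(blobName);

     // Metadata keys come back lowercased; values are always strings
     const { metadata } = await blob.getProperties();
     return {
         caption: metadata.caption,       // e.g., "a river running through a forest"
         tags: JSON.parse(metadata.tags)  // e.g., ["river", "forest", "outdoor"]
     };
 }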
