Enabling External Encoder in Microsoft Teams Live Events for Extreme Noobs

This is an exciting time in the Teams Collaboration market. Slack triggered it, and it has pushed giants like Microsoft and Cisco to build and introduce their own team collaboration solutions, each trying to address the market with supposedly unique experiences. While I'm a big fan of Cisco Webex Teams for its completeness of vision, my favorite happens to be Microsoft Teams. The reason is the rebel stance it has taken against the traditional Office applications by not adhering to their architecture. Instead, this team (Microsoft Teams' dev team) has embraced the open-source ecosystem to the extent possible and kept the traditional .Net/Visual C++ copy-paste to a minimum. The efficiency benefit shows up in the relatively tiny installation file, in the 70-80 MB range, that can be installed by the user without admin rights… which is preposterous for any traditional Microsoft developer! I love this open attitude, and for a one-year-old piece of software Microsoft Teams is loaded with features and keeps coming up with new ones every month. I would advise you to check their Twitter feed @MicrosoftTeams if you don't believe me… In comparison, both the traditional Microsoft oldies and the rest of the competition are just too slow at updating their capabilities… Unlike a traditional admin, I'm a person who likes rapid change, and this fluidity of Microsoft Teams is something I love!

Getting back to the topic, Microsoft recently announced a new feature called Live Events as part of their Meetings capabilities. While regular Meetings are built for many-to-many, real-time multimedia collaboration…

Live Events is specifically geared for 'near real-time', 'some-to-many' video broadcasting.

Bidirectional capabilities are restricted to text, not voice or video. On the flip side, the audience capacity is greatly increased beyond the 250-participant limit of regular Meetings. Further, the capability to bring in external encoders and give the event studio-like production quality completely blasts all other competition out of the water!

If this were an audio/video blog, you would be hearing a loud bomb sound now.

So, great features, but how do you actually get them going? The regular Live Events setup is pretty simple and well documented; you can check here (https://docs.microsoft.com/en-us/microsoftteams/teams-live-events/what-are-teams-live-events) for more details to get started quickly.

Further links there will guide you through enabling Live Events for all or selected users. Everything can be done from the GUI, which is boring, so I'm not going to blog about it here…

Now, when the time came to enable the External Encoder in my lab account, I had an interesting nerdish adventure, and I believe it will be of interest to anyone who has just started administering Microsoft Teams and has not faced PowerShell before. If you are an IT Pro who manages Skype for Business Online on a regular basis, then this article may be boring and you may want to stop reading…

For the rest of us, join me on a trip to Teams ‘PowerShell’ Wonderland

 

Getting Started

Normally I wouldn't have gone into this, as I typically try out Office365 stuff from my desktop, which is fully set up. This time I tried it on my new laptop with zero Office365 activity, and that meant starting from scratch… Compared to the rest of Microsoft Teams administration, this one was old school, and hence this blog.

The first thing you need is a 'Windows' OS, preferably Windows 10 Creators Update or later… if you are on something older, then you may have some other adventure in addition to what I experienced 😉… Do let me know in the comments.

 

Install Skype Online PowerShell Modules

This is usually supposed to be a boring activity… Just head over to https://download.microsoft.com/download/2/0/5/2050B39B-4DA5-48E0-B768-583533B42C3B/SkypeOnlinePowerShell.Exe

Download and install….

Beyond the need for admin rights, what could possibly go wrong??? Plenty, it turns out…

 

….the old world has to catch you by the throat and make you install its goodies first: the Visual C++ Redistributable…

 

So, head back to https://aka.ms/vs/15/release/VC_redist.x64.exe

Download and install… with admin access, of course… Now try installing the PowerShell module again.
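Once the installer goes through, a quick sanity check that the connector actually landed (assuming the installer added it to the standard module path) is:

>Get-Module -ListAvailable -Name SkypeOnlineConnector

If it comes back empty, the install didn't take.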

 

After this you need to ‘Restart’! Yippee!

Power of the Shell be with You

Now, after the reboot, open everyone's favorite adventure app: Windows PowerShell… I like the ISE, as it lets me interactively check documentation on modules and create scripts… but you could have the same adventure as this blog with the regular PowerShell console as well…

Now we need to import the module we 'installed'… Other shells don't have such needs! Why? The explanation is a bit lengthy… but google it and you should get a good answer.

 

We import the module using the following command:

>Import-Module SkypeOnlineConnector

 

This sadly results in an error!

The reason is that by default the execution policy is set to Restricted, and hence mighty powerful magic like loading script modules is not allowed… So we need to relax it to 'RemoteSigned'. Despite what the name suggests, this isn't about our commands running remotely in Office365 servers; it simply lets locally created scripts run while requiring anything downloaded from the internet to be digitally signed…

>Set-ExecutionPolicy RemoteSigned -Scope CurrentUser

You should be presented with a confirmation asking whether you have enough strength to wield such mighty powers, and whether you want to wield them always.

I usually do ‘A’ but you would be safer with ‘Y’
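If you want to see what you ended up with, this lists the effective policy at every scope:

>Get-ExecutionPolicy -List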

 

Now let’s do the Import

>Import-Module SkypeOnlineConnector

We now get somewhere, and a confirmation appears again asking whether all the new magic skills are something you can handle.

I’m a pro so I say ‘A’ …again if you want to be careful, then choose ‘R’

 

Now we are all loaded up…Time to do some magic…

Let’s prepare to do some magic

First, let's authenticate ourselves… Let's get our credentials into a variable called $userCredential

>$userCredential = Get-Credential

cmdlet Get-Credential at command pipeline position 1

Supply values for the following parameters:

 

Awesome… Now create a session to build a bridge to the Ether World

>$sfbSession = New-CsOnlineSession -Credential $userCredential

> Import-PSSession $sfbSession

If you see this… then it means it is working!

 

ModuleType Version Name             ExportedCommands
---------- ------- ----             ----------------
Script     1.0     tmp_w5fa1s0p.qns {Clear-CsOnlineTelephoneNumberReservation, ConvertTo-JsonForPSWS, Copy-C…
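One hedge from my own fumbling: if your admin account signs in with a vanity domain and New-CsOnlineSession can't work out the tenant, it also takes an -OverrideAdminDomain parameter where you pass the tenant's *.onmicrosoft.com domain (the domain below is a placeholder, not mine):

>$sfbSession = New-CsOnlineSession -Credential $userCredential -OverrideAdminDomain "yourtenant.onmicrosoft.com"

>Import-PSSession $sfbSession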

 

Finally! Let's do the stuff we actually came here to do.

Check what the broadcast policy is set to globally:

>Get-CsTeamsMeetingBroadcastPolicy -identity Global

 

Darn, it asked for credentials again!

 

But something went wrong….

Creating a new session for implicit remoting of "Get-CsTeamsMeetingBroadcastPolicy" command…

New-PSSession : [admin3a.online.lync.com] Connecting to remote server admin3a.online.lync.com failed with the following error message : The WinRM client cannot process the request. The authentication mechanism requested by the client is not supported by the server or unencrypted traffic is disabled in the service configuration. Verify the unencrypted traffic setting in the service configuration or specify one of the authentication mechanisms supported by the server. To use Kerberos, specify the computer name as the remote destination. Also verify that the client computer and the destination computer are joined to a domain. To use Basic, specify the computer name as the remote destination, specify Basic authentication and provide user name and password. Possible authentication mechanisms reported by server: For more information, see the about_Remote_Troubleshooting Help topic.
At C:\Users\<removed>\AppData\Local\Temp\tmp_w5fa1s0p.qns\tmp_w5fa1s0p.qns.psm1:136 char:17
+ & $script:NewPSSession `
+ ~~~~~~~~~~~~~~~~~~~~~~~~
    + CategoryInfo : OpenError: (System.Manageme….RemoteRunspace:RemoteRunspace) [New-PSSession], PSRemotingTransportException
    + FullyQualifiedErrorId : AccessDenied,PSSessionOpenFailed

Exception calling "GetSteppablePipeline" with "1" argument(s): "No session has been associated with this implicit remoting module."
At C:\Users\<removed>\AppData\Local\Temp\tmp_w5fa1s0p.qns\tmp_w5fa1s0p.qns.psm1:10423 char:13
+ $steppablePipeline = $scriptCmd.GetSteppablePipeline($myI …
+ ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
    + CategoryInfo : NotSpecified: (:) [], ParentContainsErrorRecordException
    + FullyQualifiedErrorId : RuntimeException

Back to the Spell Book

A bit of googling later, it turns out that Import-PSSession only imports the ingredients of our spell (the command definitions); the darn pentagram, the session they actually run in, is still stuck in the cloud, and re-creating it on the fly is what failed! So, let's enter the cloud ourselves…

> Enter-PSSession $sfbSession

[admin3a.online.lync.com]: PS>

How do you know you are in the cloud…? The command prompt has changed! You may get a different server name… but if you've reached here, you are doing good!

Now let’s check the global policy for TeamsMeetingBroadcast…

[admin3a.online.lync.com]: PS> Get-CsTeamsMeetingBroadcastPolicy -identity Global

Description                     :
AllowBroadcastScheduling        : True
AllowBroadcastTranscription     : False
BroadcastAttendeeVisibilityMode : EveryoneInCompany
BroadcastRecordingMode          : AlwaysEnabled
Key                             : [{urn:schema:Microsoft.Rtc.Management.Policy.Teams.2017}TeamsMeetingBroadcastPolicy,Tenant{800fdedd-6533-43f5-9557-965b3eca76f6},Global]
ScopeClass                      : Global
Anchor                          : Microsoft.Rtc.Management.ScopeFramework.GlobalScopeAnchor
Identity                        : Global
TypedIdentity                   : Global
Element                         : <TeamsMeetingBroadcastPolicy xmlns="urn:schema:Microsoft.Rtc.Management.Policy.Teams.2017" AllowBroadcastScheduling="true" AllowBroadcastTranscription="false" BroadcastAttendeeVisibilityMode="EveryoneInCompany" BroadcastRecordingMode="AlwaysEnabled" />

We specifically need AllowBroadcastScheduling to be True… For me it is true, and if you have already fiddled with the GUI policies then it should be true for you too… else please go back to the GUI admin centre and set meeting scheduling to True in the Global policy, or flip it from the shell as sketched below.
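If you would rather stay in the shell than go back to the GUI, flipping the flag on the Global policy should look roughly like this (I went the GUI route myself, so treat this as a sketch and run it inside the remote session):

>Set-CsTeamsMeetingBroadcastPolicy -Identity Global -AllowBroadcastScheduling $true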

 

Are we there yet?

If you've come this far, then we are ready to do the magic we came all this way for:

[admin3a.online.lync.com]: PS> Grant-CsTeamsMeetingBroadcastPolicy -Identity <type full user name here> -PolicyName $null -Verbose

 

Whoosh!

VERBOSE: Performing the operation “Grant-CsTeamsMeetingBroadcastPolicy” on target “<the username will appear here>”.

VERBOSE: Audit disabled on Cmdlet level

We finally did it!
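If you want to double-check from the shell before heading to the portal, the user object should now show the per-user broadcast policy cleared; an empty value means the Global policy applies. The property name below is my assumption from the Skype for Business Online cmdlet output, so treat it as a sketch:

>Get-CsOnlineUser -Identity "<type full user name here>" | Select-Object DisplayName, TeamsMeetingBroadcastPolicy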

 

How do I check?

Head back to the Microsoft Stream portal and click on the Create drop-down… the user for whom you did the magic should now see 'Live Event (preview)'.

Now head back to the Teams client or web page and create a new Live Event meeting, and the user should see the 'External Encoder' option enabled…

Awesome! Thanks for being with me on this adventure! Now your user can configure External Encoder in their Live Events!

 

I wish the Microsoft Teams dev team would put in a little more effort to do away with this adventure and let the administrator enable/disable the External Encoder from the GUI itself… IMHO, PowerShell for this is overkill, as only a few people will ever be given this magic gauntlet.

What Next? I want more adventure…

Now may be a good time to check out Luca Vitali's article on how to use OBS as an external encoder for your event at https://lucavitali.wordpress.com/2018/08/24/how-to-use-obs-studio-external-encoder-for-live-events/

For other, more 'not free' solutions, head over to https://docs.microsoft.com/en-us/stream/live-encoder-setup

All the Best!!

OneDrive for Business (16) Greedy Cache Problem

For the past week I've been struggling with my C: drive filling up quickly, despite having cleared the temporary files and even moved my OST file to a different drive… I noticed that OneDrive for Business (OneDrive4B) was refusing to complete its sync, and then it occurred to me that it was probably the culprit… Every time I freed up some space, the pending file count would reduce and then stall while the OS started complaining of low space on C:

After googling and a few hacks, I found the main culprit: the OneDrive4B cache cannot be moved anywhere other than its pre-designated location! That capability was, IMO, there in pre-14 versions but is no longer supported!

Anyway, I went to the C:\Users\<username>\AppData\Local\Microsoft\Office\16.0 folder and, to my horror, saw multiple OfficeFileCachexxx.old folders, each occupying GBs of space! I deleted all the *.old folders and my C: drive regained the plenty of spare GBs it started off with. Problem partially solved… OneDrive now syncs, maintains only one copy of the cache, and leaves the rest of the space on C: alone… But why doesn't Microsoft allow the cache to be moved to a more spacious location? I wonder!
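For the next time this happens, here is a small sketch to spot and clear only the stale cache folders (same path as above; the -WhatIf switch previews what would be removed, and the live cache folder without the .old suffix is left alone):

>Get-ChildItem "$env:LOCALAPPDATA\Microsoft\Office\16.0" -Directory -Filter "OfficeFileCache*.old" | Remove-Item -Recurse -Force -WhatIf

Drop the -WhatIf once you are happy with the list it shows.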

Rendezvous with Azure Container Instance

Microsoft released Azure Container Instances last week and, unlike the rest of the container-oriented cloud solutions in the market, this is a unique offering and I couldn't resist trying it out. The quickstart guide at https://docs.microsoft.com/en-us/azure/container-instances/container-instances-quickstart lets us experience how it can be used from the az CLI, both from the Azure portal and from the laptop. The rest of the tutorials also help explain how a standalone app can be packaged, loaded into the Azure Container Registry and then run inside ACI.

So, what is this service….

To start with, it is basically Docker Linux instances created on call and destroyed when done, without the tenant having to invest in a fixed pool of virtual machines, which happens to be the case with other container services. This is not the first such offering; there are other players in the market providing similar services (hyper.sh, now.sh, etc.) where customers can have their own orchestration infrastructure on dedicated infrastructure and use flexible cloud infrastructure that scales from 0 to… a very large number. Of course, there are the serverless services provided by the big three cloud vendors… but they happen to be very opinionated and short-lived on a per-invocation basis. ACI, on the contrary, gives the serverless experience in an unopinionated way and is perfectly ready for limited but long-running services.
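To see the 'created on call, destroyed when done' idea in action, the quickstart roughly boils down to the following az CLI calls (the resource group and container names below are placeholders of mine; the hello-world image is the one the quickstart uses):

>az group create --name aci-demo-rg --location eastus

>az container create --resource-group aci-demo-rg --name aci-demo --image microsoft/aci-helloworld --ip-address public

>az container show --resource-group aci-demo-rg --name aci-demo --query ipAddress.ip

>az container delete --resource-group aci-demo-rg --name aci-demo --yes

Hit the returned IP in a browser between the create and the delete, and you've had your first on-demand container moment.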

To enable this capability to be controlled by customer-built orchestrators, Microsoft has already released the ACI Connector for Kubernetes (https://github.com/Azure/aci-connector-k8s). This connector runs as a container in the local Kubernetes infrastructure and proxies requests to create and destroy containers as per the developer-provided YAML. From my testing, this was not working on the day of launch but got fixed on 4th Aug 2017… it works perfectly now… I hope that in future this capability expands to other orchestrators and adds more features.

What would be good use cases for this?

Like similar container PaaS services, this can be used to handle worker services that need more compute time than the serverless offerings will accept. Otherwise, at current pricing, it is not a good candidate to replace always-on VM-based services.

Cloud App–Weekend #1–Azure AD Authentication on NodeJS

It's been quite some time that I've been looking for a good combination of backend and frontend web platform on which to start building apps. After playing around with PHP, .Net, Python and Ruby, I stumbled upon NodeJS, and IMHO it makes sense for new app dev that is specifically "born in and for the cloud". NodeJS, however, is only part of the stack, as I wanted other products to handle the frontend, database, and the overall hosting, security, load balancing, scaling and various other services we'll need for a complete app. I chose NodeJS as my web platform as I found it very simple to set up on both Windows and Linux. Further, having JS as the language on both sides makes life a bit simpler…

Now back to my app. The first bit I want to handle is user administration, and I found AzureAD to be very simple and straightforward as long as I don't get into integration with on-premise AD. I would have preferred more 'free' sign-up mechanisms like Google, Facebook and OpenID, but gave them up for something with better 'enterprise' capabilities… hence AzureAD.

Let's Create the User(s)

You can open an Azure account on a Pay-As-You-Go basis for evaluation purposes, and though credit card details are taken, they don't get charged as long as we stay out of the charged services 😉. The instructions on the signup page and FAQ are self-explanatory, so I'm not going to go into detail. User creation for up to 5 users is free, so I'll keep my user count within that limit.

By now I believe you should have created an Azure subscription and logged into the Azure portal. To create users, click on the "Browse All" button.

A window with all services and another window with active services will appear. Click on Active Directory to see the list of directories configured. The first time, this will be empty, and a default directory will need to be created before you can access any AD service.

Click on the directory/workspace to configure the AD services.

You should see the configured users, which initially will include only the admin user created by default. To test things out, create a new 'basic' user by clicking on the "Add User" button.

Keep the default "New user in your organization" and provide a name for the user.

In the next form give the First Name, Last Name and Display Name…

And now you should be presented with the "Get temporary password" screen… Hit "Create". You should be shown a temporary password for this new user… I would suggest you copy-paste this password somewhere safe. Next, click on the user in the directory listing and, from the user details listed, note down the <username>@*.onmicrosoft.com user id.

Now, this user will only be usable from NodeJS after we change the password. Do not try 'reset password', as it will only give you one more temporary password 🙁… So sign out for now… and log in again with this new user's credentials… You will be forced to change the password, so do it. You should now be able to log in with this password for this user… Ensure this works before going to the next steps.

Client ID and Client Secret Creation for our App

Now for the next stage: our NodeJS app will require a Client ID and Client Secret… For this you will need to go back to the AD screen and click on the directory/workspace where you created the user. Now click on the 'Applications' tab.

You should see a list of apps, which could be empty in your case, so don't be scared; we'll create one in a few steps. I've already created credentials for my 'NodeJS' app… Let's see how you can create one…

Click on the "Add" button at the bottom… Choose "Add an application my organization is developing"… It's OK to hide the fact that this is for your personal experimentation 😉

Now you will be asked for a name for the web app… Feel free to give a friendly name; it need not match the NodeJS app being developed, but it is easiest to use the same name. Do keep the "Web Application and/or Web API" option selected.

Now you will be asked for URLs!

I got stumped at this step, as I'm developing NodeJS on my PC and will not have a public-facing URL… Initially I entered something and then debugged to figure out what I should use… For the further steps I will be running NodeJS on my laptop, so these instructions are valid only for that use case. Once the app is ready and we have a public-facing URL, these URLs can be updated.

So I used "http://localhost:3000/auth/openid/return" as the Sign-on URL and "http://localhost:3000" as the App ID URI… If this is your first time, I would suggest you try the same, as this is the default used by the OpenID connector we will use in our NodeJS app.

Next you should be welcomed by the page for your newly created application.

This page also has several useful links to articles on developing apps that use this facility… You should read them… but I'm going to move on…

Click on the Users tab and you will see the list of users enabled to use the app credentials you just created. You will notice that the new user you just created is marked as 'No' under Assignment! Let's go ahead and change that to 'Yes'. Click on the user and you will notice an 'Assign' button at the bottom… Click it, and confirm that you do want to assign this user to this app. It will take some time for Azure to do its magic, and then you will see the refreshed list of users with the new user marked as 'Yes'. Now let's go get our Client ID and Secret. Click on the 'Configure' tab.

You should now see the 'Client ID'… Copy it and store it in a safe place, as we'll need it in our app later. Now, where is the client 'Secret'?

Scroll down and you will see the 'Keys' section… This is where our 'Secret' is… but it is empty…

We have to create our new secret, so pull down the Duration box and select 1 year or 2 years… your choice… Click on the 'Save' button at the bottom to reveal the 'Secret'… Copy it and save it in a safe place along with the Client ID… Keep the 'Secret' safe: if you lose it, the only way to proceed is to create a new key! Which at this stage is not a big deal, but in production settings… it could be a big problem 🙂

The NodeJS App

Now it is safe to close the browser and sign out… We will continue on our local PC. Install NodeJS… the instructions are simple for both Linux and Windows. I would also recommend installing Git for the next steps; this too is simple, and we can proceed with the default installation without any problem. Just ensure that the following commands execute when called:

  • node
  • git
  • npm

If they don't execute, the installation probably did not go well… Ensure they work before proceeding. Also ensure your internet connection is working and you are not behind a corporate firewall.
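A quick way to confirm all three are on the PATH is to ask each for its version:

  • $ node -v
  • $ git --version
  • $ npm -v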

Now create a folder for our app… from inside this folder, on the command line, type:

  • $ git clone https://github.com/AzureADSamples/WebApp-OpenIDConnect-NodeJS.git

This will download the OpenID sample app into the folder. Change directory into the WebApp-OpenIDConnect-NodeJS folder and type:

  • $ npm install

npm will now do its magic and pull all the dependencies this sample app needs into the node_modules folder.


Now open the config.js file and you will notice two items with no details filled in… Yup, clientID and clientSecret… Now is the time to go back to your safe place, copy the Client ID and Client Secret, and paste them here.

Now, as per the developer of this AzureAD connector, it is recommended to install Bunyan to make sense of the logs being thrown out… but this is optional and I'm keeping it out for now. The command below will run the NodeJS app:

  • $ node app.js


Now open the web app in your local browser on port 3000. This port can be changed inside app.js, at line 192 in the sample.

Click "Log in"… you will be taken to AzureAD's login page… Log in with your user id and you should return to the user page in this tiny app.

That’s it …the app works and the first part is over…

Hope it worked for you too 🙂