After describing the Call Quality Dashboard (CQD) QoE Archiving Database in part 1, I will now show how to install the CUBE component and what its role is in the solution.
The CUBE is "where data from QoE Archive database is aggregated for optimized and fast access" by the Portal component: this is the 'data crusher'.
The CUBE is a SQL Server Analysis Services (SSAS) database, generically known as an online analytical processing (OLAP) cube.
Installing CQD – QoE CUBE
Before performing the installation, the following prerequisites need to be in place:
- You need a server with SQL Server Analysis Services (SSAS) installed. The following picture (all-in-one example) shows the required SQL components for a CQD installation
- It's recommended to create a dedicated domain service account and grant it only the least required privileges. This account is used to trigger the cube processing.
- The QoE Archiving Database must already be deployed.
- You need to run the installation on the SQL Server where the QoE Archive Database was installed, because some files will be installed there and used by the SQL Agent.
The installation package is the same for all CQD components, so: (a) if you are installing all components, you can go straight to step 2; (b) if you already installed the QoE Archiving component on the same server, go to 'Programs and Features', choose 'Change' on the package, and proceed to step 2:
- Proceed through the welcome screen and license agreement, and choose the binaries install location
- For this part I will select the QoE CUBE component and proceed to the configuration screen
• QoE Archive SQL Server Instance: name of the SQL Server instance where the QoE Archive DB is located. Leave this field blank to specify the default SQL Server instance, or enter the instance name to specify a named instance
• Cube Analysis Server: SSAS server and instance name where the cube is to be created. This can be a different machine, but the installing user has to be a member of the Server Administrators of the target SSAS instance.
• Use Multiple Partitions: 'Multiple Partitions' requires the Business Intelligence or Enterprise edition of SQL Server. 'Single Partition' only requires the Standard edition, but cube processing performance may be impacted.
• Cube User – User Name & Password: Domain service account that will trigger the cube processing.
- After the validations, the installer will ask you to proceed until completion, hopefully without any errors
Behind the CQD QoE CUBE
What happened and what was configured during the previous installation steps?
This component setup installed some specific files, created an SSAS database, and made some updates to the QoE Archiving Database:
• The QoECube SSAS database was created;
• A 'Cube User' login was created and assigned db_datareader and db_datawriter on the QoEArchive database;
• A credential was created with the 'Cube User' account. This will be used to impersonate the connection to the QoECube on the SSAS server;
• A linked server was created, mapping the databases on the source SQL Server;
• A second step and a proxy were added to the SQL Agent job created by the QoE Archive setup. This is the 'brain' that will trigger the cube;
• The files used by the Agent to trigger the cube were installed.
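As a sketch, the credential and Agent proxy pieces described above correspond roughly to the following T-SQL (all object names here are illustrative assumptions, not the exact ones the installer creates):

```sql
-- Credential holding the 'Cube User' domain account (illustrative names).
CREATE CREDENTIAL QoECubeCredential
    WITH IDENTITY = N'LAB\service.cube',
         SECRET   = N'<service-account-password>';

-- SQL Agent proxy bound to that credential.
EXEC msdb.dbo.sp_add_proxy
    @proxy_name      = N'QoECubeProxy',
    @credential_name = N'QoECubeCredential';

-- Allow the proxy to run operating-system (CmdExec) job steps,
-- which is how the cube-processing command script is launched.
EXEC msdb.dbo.sp_grant_proxy_to_subsystem
    @proxy_name     = N'QoECubeProxy',
    @subsystem_name = N'CmdExec';
```

This is why the job step can run under the dedicated service account instead of the SQL Agent service account.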
Known caveats regarding the installation and architecture:
- The command script 'process.bat' that triggers the cube processing overwrites the error log 'process.log' at every execution. Since the Agent job runs every 15 minutes, you might not catch the cause/history of past errors.
As a quick workaround, you can change the script command to append (>>) the output to the existing log file:
"%~1QoECubeService.exe" "%~1cubeModel.xml" >> "%~1process.log"
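If you also want each run delimited in the appended log, and stderr captured as well, a small variation of the command (still assuming the same %~1 install-folder argument the original script receives) could be:

```batch
rem Mark the start of each run so entries in process.log can be told apart.
echo ==== Cube processing run: %date% %time% ==== >> "%~1process.log"
rem Append stdout and stderr (2>&1) instead of overwriting the log.
"%~1QoECubeService.exe" "%~1cubeModel.xml" >> "%~1process.log" 2>&1
```

Keep an eye on the log size if you do this, since nothing trims the file anymore.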
- Don't use a domain user account password starting with '+'. The setup SQL procedure will ignore it, you will get the following error on the SQL job, and the cube trigger will not start:
"Unable to start execution of step 1 (reason: Error authenticating proxy LAB\service.cube, system error: The user name or password is incorrect.). The step failed."
How to manage and monitor the CQD QoE CUBE process?
The main CUBE processing is triggered by the same SQL Agent job created by the QoE Archiving setup. A second step is added to the job, and whenever there is new data synchronized from QoEMetrics to QoEArchive, the job will launch a command script:
Execution errors will be logged in the SQL Agent log, and details can be found in the file 'process.log', generated in the same folder as the command script.
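To check the recent outcome of the cube-processing step without opening the Agent log viewer, you can query the SQL Agent history tables in msdb; the job-name filter below is an assumption, so adjust it to the actual job created by the QoE Archiving setup:

```sql
-- Last 20 Agent history rows for the QoE job
-- (run_status: 1 = succeeded, 0 = failed).
SELECT TOP (20)
       j.name AS job_name,
       h.step_id,
       h.step_name,
       h.run_status,
       h.run_date,
       h.run_time,
       h.message
FROM msdb.dbo.sysjobs AS j
JOIN msdb.dbo.sysjobhistory AS h
  ON h.job_id = j.job_id
WHERE j.name LIKE N'%QoE%'   -- assumption: adjust to the real job name
ORDER BY h.run_date DESC, h.run_time DESC;
```

A failing cube trigger will show up here as step 2 rows with run_status = 0, and the message column usually points you to 'process.log' for the details.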
Now you have a replica of your QoE data and a tool to process and analyse it. You still need an interface to visualize and manipulate it, which is described in part 3.
There is a way to script the previous installation in a single command line (you just need to replace the example values with your settings):
Msiexec /i "CallQualityDashboard.msi" ADDLOCAL=QoECube REBOOT=ReallySuppress CQD_INSTALLDIR="D:\Skype4B\CQD" CUBE_ARCHIVE_SERVER="LYNC-CQD.my.lab\CUBE" DISABLE_CUBE_MULTIPLE_PARTITION="true" CUBE_ANALYSIS_SERVER="LYNC-CQD.my.lab\CUBE" CUBE_USER="LAB\service.cube" CUBE_PASSWORD="WhoKnows?" /qb!
- You still need to run this on the server holding the QoE Archiving database (it needs to install the Agent script files)
- Be sure to use lowercase 'true' or 'false' for the DISABLE_CUBE_MULTIPLE_PARTITION parameter.
The installer writes this value 'as is' into the cubeModel.xml file; if it is not lowercase, the Agent job will fail and you will see an error in 'process.log':
"Error while Processing: There was an error deserializing the object of type Microsoft.Rtc.Qoe.Cqd.QoECubeService.CubeProcessModel. The value 'True' cannot be parsed as the type 'Boolean'."
You can fix this by lowercasing the value of the <DisablePartitioning> parameter in cubeModel.xml.
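For reference, the corrected fragment of cubeModel.xml would look like this (surrounding elements omitted; only the element named in the fix above is shown):

```xml
<!-- The deserializer expects a lowercase boolean here -->
<DisablePartitioning>true</DisablePartitioning>
```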