
Applications

You can check the list of applications and their properties. You can either edit an application by clicking on the pen icon or delete it by clicking on the trash bin icon (both icons appear when you hover over the application's line).

To add a new application that is not yet in the list, just click on the "Add an application" button.

If you click on an application, you can view its detailed analysis, organized into 5 tabs:

  • Dashboard
  • Meter
  • Test Results
  • Budgets
  • Evolution

Dashboard

The Dashboard tab gives you a summary of your application’s results. You can select the version of the application by clicking in the box right next to its name.

Eco-score

The global eco-score is an indicator of the level of eco-design of your application. It is the average of the eco-scores of the following domains: Network and Client resources.

All eco-scores range from 0 to 100. To get the highest score, it is important to run all the analyses.
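As a minimal illustration of the averaging described above (the domain values below are made up; the real computation may differ in details such as rounding):

```python
# Illustrative only: the global eco-score as the plain average of the domain eco-scores.
# The domain values here are example numbers, not real measurement results.
domain_scores = {
    "Network": 72,
    "Client resources": 58,
}

global_eco_score = sum(domain_scores.values()) / len(domain_scores)
print(f"Global eco-score: {global_eco_score:.0f} / 100")  # -> 65 / 100
```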

Two metrics are indicated with the eco-score:

  • the evolution of the eco-score since the previous version of the application;

  • the average of the eco-scores across your business.

Eco-score by domain

  • Network: evaluates the level of eco-design of the requests exchanged between client and server. It is calculated by analyzing the requests and their content. To get a score, you need to run measurements with the TestRunner with the http_request option activated.
  • Client resources: evaluates how efficiently your application uses client-side resources. It is calculated from client-side resource consumption (energy, CPU, memory…). To get a score, you need to run measurements with the TestRunner on a smartphone.

Improvements summary

This is a small bar graph summarizing the number of rules by priority. You can filter the rules in the Improvements table below by clicking on the corresponding priority. To display all the rules again, select "All the rules" in the select box on the right of the table.

Resources impact

This graph summarizes the impact of the application usage on the mobile device.

Improvements

This is the list of all the rules that were checked during the analysis: rules that are respected appear in green, whereas rules that are violated appear in yellow, orange or red. They are prioritized according to the gain you can obtain. Each rule is classified according to its domain. The number next to the domain corresponds to that domain's eco-score.

Each rule is associated with a priority, a score and a gain.

  • The Priority is an indicator to help you prioritize your work and is directly correlated with the gain.

  • The Score indicates how well the rule is respected by the application. Adding up all the scores of a given domain gives that domain's eco-score.

  • The Gain is the difference between the maximum score and the actual score. It is correlated with the priority to help you choose which rules to work on first (see the sketch below).
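A minimal sketch of how score, gain and the domain eco-score relate, assuming each rule has a maximum achievable score (the rule names and numbers below are purely illustrative):

```python
# Illustrative only: the relation between a rule's score, its gain,
# and the domain eco-score. Rule names and values are made up.
rules = [
    {"name": "Example rule A", "score": 10, "max_score": 15},
    {"name": "Example rule B", "score": 12, "max_score": 12},
    {"name": "Example rule C", "score": 5,  "max_score": 10},
]

for rule in rules:
    gain = rule["max_score"] - rule["score"]  # gain = maximum score - actual score
    print(f"{rule['name']}: score={rule['score']}, gain={gain}")

# The domain eco-score is the sum of the scores of its rules.
domain_eco_score = sum(rule["score"] for rule in rules)
print(f"Domain eco-score: {domain_eco_score}")
```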

Consumed resources

Summary of the consumed metrics for:

  • Mobile URL: Platform Discharge, Process Data, Process Memory, Process CPU

  • PC URL: Platform Energy

  • Mobile APK: Platform Discharge, Process Data, Process Memory, Process CPU

For each metric, the difference between the last two versions is shown to the left of the value.

Meter

In the left menu, you can search and sort your test cases by Meter metrics. When you click on a test case, you can check its detailed analysis.

Metrics are grouped into metrics associated with the entire platform and metrics associated with a process or a thread (see More on metrics).

You can verify the stability of your measurements (see Getting Started).

Test Results

Here you will find the results of your tests (pass or fail).

Budgets

A budget can be set for each test and each metric (see Getting Started for usage).
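As an illustrative sketch only (budgets are defined through the web interface as described in Getting Started; the metric names, units and thresholds below are assumptions), a budget is essentially a threshold that a measured value is checked against:

```python
# Illustrative only: a budget as a pass/fail threshold on a measured metric.
# Metric names, units and values are examples, not a Greenspector configuration format.
budgets = {
    "Platform Discharge (uAh)": 15000,
    "Process Memory (MB)": 250,
}

measured = {
    "Platform Discharge (uAh)": 14200,
    "Process Memory (MB)": 310,
}

for metric, limit in budgets.items():
    status = "OK" if measured[metric] <= limit else "budget exceeded"
    print(f"{metric}: {measured[metric]} (budget: {limit}) -> {status}")
```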

Evolution

In the Evolution tab, you can compare the average measurement results by version on a graph. From the test case list, you can check the detailed analysis of each test case between two versions.
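The aggregation shown on the graph is simply an average of the measurement results per version; a sketch with made-up numbers:

```python
# Illustrative only: averaging a metric's results per application version.
# Version numbers and measurement values are made up.
measurements = {
    "1.0": [120.0, 118.0, 122.0],  # one value per iteration
    "1.1": [101.0, 99.0, 100.0],
}

for version, results in measurements.items():
    average = sum(results) / len(results)
    print(f"Version {version}: average = {average:.1f}")
```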


Schedule a Test

Test

This window lets you launch benchmark tests. It asks for the following parameters (an illustrative recap follows the list):

  1. Select your application: the name of an application already created in Greenspector. You can add a new application with "Add application".
  2. Version: an application version already created in Greenspector. You can add a new version with "Add version".
  3. Select the environment for the test: the device used for the test. This device is in the Power Test Cloud or in your Power Test Bench.
  4. Website or Mobile App: you can test either a website or a mobile application.
  5. URL or path: the URL to test or the URL of an APK to test.
  6. Iterations: the number of executions of the benchmark.
  7. Chrome or Firefox: the browser to use in the case of a URL test.
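The sketch below only recaps these parameters in one structure; it is purely illustrative and does not represent the Greenspector API or any real configuration format (all values are assumptions):

```python
# Illustrative recap of the benchmark parameters listed above.
# This is NOT the Greenspector API; names and values are assumptions.
benchmark_test = {
    "application": "MyApp",                 # must already exist in Greenspector
    "version": "1.2.0",                     # must already exist for that application
    "environment": "Galaxy S9",             # a device from the Power Test Cloud or your Power Test Bench
    "target": "website",                    # "website" or "mobile app"
    "url_or_path": "https://example.com",   # URL to test, or URL of an APK
    "iterations": 3,                        # number of executions of the benchmark
    "browser": "Chrome",                    # Chrome or Firefox, for URL tests
}

print(benchmark_test)
```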

Jobs

This window lists all the jobs you have launched.

  • Pending: jobs waiting to start (mobile device not available)
  • Running: jobs in progress
  • Finished: jobs that are done

Once a job is finished, several pieces of information are available in the Finished list:

  • Ended at: when the job finished
  • Status: Failed or Finished. In case of failure, you can get detailed information about the errors via the "!" icon.
  • Test Passed: status of the functional tests.

Modules

List of the available Greenspector modules that you can download.


...

This option in the menu on the left lets you manage the current users: add a new user or delete an existing one.
