Usage
After logging in, you can select one of the following functionalities in the menu on the left:
- Applications - upload, manage and view the detailed analysis of your applications;
- Schedule a Scan - launch an analysis of your source code and follow the progress of its jobs;
- Rules - check the list of possible problems, each with a detailed description and a suitable solution, to help you gain in energy, memory and performance.
If you are an admin, you can also access:
- Administration - check and manage test times, users (add specific roles, check their last connection, add new users or delete existing ones) and license information.
To change your user preferences or manage your private token, click on your user name in the top-right corner and select Preferences.
Applications
You can check the applications list and their properties. You can either edit an application by clicking on the pen or delete it by clicking on the trash bin (both icons appear when you move your cursor over the application line).
To add a new application that is not yet in the list, click on the "Add an application" button.
If you click on an application, you can view its detailed analysis, organized in 5 tabs:
- Dashboard
- Meter
- Test Results
- Budgets
- Evolution
Dashboard
The Dashboard tab gives you a summary of your application’s results. You can select the version of the application by clicking in the box right next to its name.
Eco-score
The global eco-score is an indicator of the application's level of eco-design. It is the average of the eco-scores for the following domains: Network and Client resources.
All eco-scores range from 0 to 100. To get the highest score, it is important to launch all analyses.
Two metrics are shown next to the eco-score:
- the evolution of the eco-score since the previous version of the application;
- the average eco-score across your business.
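As a rough sketch, the global eco-score can be seen as the average of the per-domain scores. The plain unweighted average and the function name below are assumptions for illustration; the product may weight domains differently.

```python
def global_eco_score(domain_scores):
    """Average the per-domain eco-scores (each between 0 and 100).

    `domain_scores` maps a domain name (e.g. "Network", "Client resources")
    to its eco-score. A plain unweighted average is an assumption here.
    """
    values = list(domain_scores.values())
    return sum(values) / len(values)

# Illustrative values, not real measurements.
scores = {"Network": 72, "Client resources": 58}
print(global_eco_score(scores))  # 65.0
```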
Eco-score by domain
- Network: evaluates the level of eco-design of the requests exchanged between client and server. It is calculated by analyzing the requests and their content. To get a score, you must run measures with the TestRunner with the http_request option activated.
- Client resources: evaluates whether your application uses client-side resources efficiently. It is calculated from the client-side resource consumption (energy, CPU, memory…). To get a score, you must run measures with the TestRunner on a smartphone.
Improvements summary
This is a small bar graph summarizing the number of rules by priority. You can filter the rules in the Improvements table below by clicking on the corresponding priority. To display all the rules again, select "All the rules" in the select box on the right of the table.
Resources impact
This graph summarizes the impact of the application usage on the mobile device.
Improvements
This is the list of all the rules checked during the analysis: rules that are respected appear in green, whereas violated rules appear in yellow, orange or red. They are prioritized according to the gains you can achieve. Each rule is classified by domain; the number next to the domain is its eco-score.
Each rule is associated with a priority, a score and a gain.
The Priority is an indicator that helps you prioritize your work and is directly correlated with the gain.
The Score indicates how well the rule is respected in the application. Adding up all the scores of a specific domain gives that domain's eco-score.
The Gain is the difference between the maximum score and the actual score. Together with the priority, it helps you choose which rules to work on first.
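The relations above can be sketched as follows. The function names and rule values are illustrative, not the product's API:

```python
def domain_eco_score(rule_scores):
    """A domain's eco-score is the sum of the scores of its rules."""
    return sum(rule_scores)

def rule_gain(max_score, actual_score):
    """The gain is the difference between the maximum score and the actual score."""
    return max_score - actual_score

# Example: three hypothetical rules of one domain as (name, max score, actual score).
rules = [("Rule A", 10, 7), ("Rule B", 5, 5), ("Rule C", 20, 12)]

print(domain_eco_score([actual for _, _, actual in rules]))        # 24
print({name: rule_gain(mx, actual) for name, mx, actual in rules}) # {'Rule A': 3, 'Rule B': 0, 'Rule C': 8}
```

A gain of 0 (Rule B) means the rule is fully respected and offers no further improvement.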
Consumed resources
Summary of the consumed metrics for:
- Mobile url: Platform Discharge, Process Data, Process Memory, Process CPU
- PC url: Platform Energy
- Mobile apk: Platform Discharge, Process Data, Process Memory, Process CPU
The difference between the last two versions of the metric is shown on the left of the value.
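That displayed difference can be sketched as a simple subtraction between the last two versions; the metric and values below are illustrative:

```python
def metric_delta(previous_value, current_value):
    """Difference shown next to a value: current version minus the previous one."""
    return current_value - previous_value

# Example: Process Data (in bytes) measured on the last two versions of the application.
print(metric_delta(previous_value=1_200_000, current_value=950_000))  # -250000
```

A negative delta on a consumption metric means the new version consumes less, i.e. an improvement.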
Meter
In the left menu, you can search and sort your test cases by Meter metrics. When you click on a test case, you can check its detailed analysis.
Meter Metrics
Metrics are split between those associated with the entire platform and those associated with a process or a thread.
Site URL metrics
Component | Metric | Unit | Description
---|---|---|---
Platform | Platform CPU | % | Average CPU consumed by the platform
Platform | Platform Memory | MB | Total memory consumed by the platform
Platform | Platform Data | B | Total network data exchanged (in and out) by the platform
Platform | Platform Energy | Wh | Total energy consumed by the platform
Platform | Platform Max Power | W | Maximum power consumed by the platform
Platform | Platform Discharge | Ah | Total ampere-hours consumed by the platform
Platform | Platform Discharge per second | Ah/s | Average ampere-hours per second consumed by the platform
Process | Process CPU | % | Average CPU consumed by the process
Process | Process Memory | MB | Average memory consumed by the process
Process | Process Data | B | Total network data exchanged (in and out) by the process
Process | Process Energy | Wh | Total energy consumed by the process
Process | Process Discharge | Ah | Total ampere-hours consumed by the process
APK metrics
Metric | Description
---|---
Activities | Number of current activities
Application Contexts | Number of application contexts
Managers of in use assets | Number of resource managers in use
In use assets | Number of resources loaded into memory
Death Recipients | Number of interfaces for receiving a callback when the hosting process has gone away
SSL Sockets opened | Number of open SSL sockets
Loaded root views | Number of windows or dialog boxes active for the current activity
Loaded views | Number of components inheriting from the View interface loaded into memory
Rendered frames | Number of frames. This value is calculated as the difference between the frame counts at the start and end of the test.
Rendered frames with slow loading | Number of frames taking more than 16 ms to render. This value is calculated as the difference between the frame counts at the start and end of the test.
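The frame-count computation described in the last two rows can be sketched as follows; the counter names and values are illustrative:

```python
def frames_during_test(count_at_start, count_at_end):
    """Frames rendered during the test: end counter minus start counter.

    The same subtraction applies to the slow-frame counter
    (frames taking more than 16 ms to render).
    """
    return count_at_end - count_at_start

# Example: total and slow (>16 ms) frame counters read at the test boundaries.
print(frames_during_test(count_at_start=1500, count_at_end=2100))  # 600 frames rendered
print(frames_during_test(count_at_start=40, count_at_end=52))      # 12 slow frames
```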
Test Results
Budgets
Evolution
In the Evolution tab, you can compare on a graph the average measure results across versions. In the test case list, you can check the detailed analysis of each test case between two versions.
Schedule a Test
Modules
Rules
Each tab (Android, CSS, HTML, Java and PHP) lists the rules for one language; you can use the search tool to filter them. Each rule has its Energy Gain, Memory Gain, Performance and Difficulty.
Administration
Test times
License
Users
This option in the menu on the left lets you manage the current users: add a new one or delete an existing one.