Mini Fixes
- The API key can now also be passed in the request header for the API methods
- Booking: a booking cleanup time was added
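The API-key-in-header change above can be sketched as follows. This is a minimal illustration only: the endpoint URL and the header name `apikey` are assumptions, not the documented names.

```python
# Sketch: passing the API key in a request header instead of a query
# parameter. The URL and the header name "apikey" are assumptions;
# check the API reference for the exact names.
import urllib.request

def build_request(url: str, api_key: str) -> urllib.request.Request:
    """Build a request that hands the API key over in the header."""
    req = urllib.request.Request(url)
    req.add_header("apikey", api_key)  # hypothetical header name
    return req

req = build_request("https://example.com/api/assets", "SECRET")
print(req.get_header("Apikey"))  # urllib stores header names capitalized
```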
Reporting changes
LOGBOOK REPORTING
- A staff number was added to the fringe benefit reports
- A new logbook report grouped by drives was added
GFS WORKSHOP APP
A new workshop module was added; the module must be activated on the account as SU.
The workshop app can be activated for operators, managers or users.
If configured, the workshop app module can be used in the new software.
The following functions are available:
- Set an IMEI (standard case)
- Assign the asset to a tag (the asset key stays as it is, the asset IMEI gets the format tag::codeinfo, the tag is configured as assigned, and the tag name is set to the asset name)
- Important settings can be configured
- The asset service section can be filled out
- If logbook data already exists, the odometer and operating-hours values can be calibrated
Form Data
A generic form module was added.
- Forms can be defined in the Form table
- Forms must be assigned to accounts in the DB
- Forms can be triggered in the Workshop module
- Form data can be listed and requested in the Workshop module
AEMP FEATURES
Feature | Description |
---|---|
AEMP Feature | The "Machine Data" module was created to show AEMP data |
AEMP ASSET ALARM | If configured on the account, e-mails are triggered to the customer when a new asset is created based on AEMP data |
AEMP geofencing | Geofence processing was added to the report and the export (geiger) |
Booking FIXES
- RAG: if a key safe is configured, bookings are started and stopped automatically
- The booking report was improved: KM values are fetched from the logbook table based on the booking ID, and a cost center section was added. If cost centers contain a ',' or '%' separator character, the KM values are divided across the trips booked on multiple cost centers.
- Bookings: if the site does not have a key safe, bookings are started and stopped automatically. There are no additional alarms yet.
- Calendar entries with accept buttons are added to the e-mail (because the publish-only mode does not work well), but only if this is enabled on the account and a valid calendar e-mail entry exists
- The private flag in the booking process was made configurable in the booking options
- The earliest booking start and latest return time per day were made configurable at the booking site level
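The cost center split in the booking report above could look like the following sketch. The separator handling follows the note ( ',' and '%' ); the equal-split rule and the function name are assumptions for illustration.

```python
# Sketch of how a booking's KM value could be divided when the booking
# spans multiple cost centers separated by ',' or '%'. The equal-split
# rule is an assumption based on the release note.
import re

def split_km(cost_centers: str, km: float) -> dict[str, float]:
    """Divide the KM value evenly across all listed cost centers."""
    centers = [c.strip() for c in re.split(r"[,%]", cost_centers) if c.strip()]
    if not centers:
        return {}
    share = km / len(centers)
    return {c: share for c in centers}

print(split_km("CC100,CC200%CC300", 90.0))
# {'CC100': 30.0, 'CC200': 30.0, 'CC300': 30.0}
```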
VIP TESTS (Very important this time)
- Asset
- EDIT / UPDATE / DEL asset
- Save and edit asset restrictions
- TEST all asset restrictions with data flow
- Duplicate
- Geofence
- Creation
- Updates
- Import and Export
- Processing with set filters
- Save and edit geofence restriction
- Geo
Habau Triggered Changes
Data Protection Config on asset level
On the asset level it is possible to configure a new mode that requires a reason to be entered before the data of an asset can be viewed. Only admins are allowed to change the data protection setting on the asset level.
If the asset is assigned a cust cat with a higher data protection level than the asset itself, the asset value is overwritten.
On the account it is possible to activate or deactivate this asset-based data protection.
Infos:
- Admins can see all data
- If you log in as e.g. a manager with the SU password, the restrictions apply
- Assets like this are not displayed on the map
- If a manager runs reports for all assets, assets with a data protection mode set are not added to the report
- Managers, users etc. can request access using the release form; after this operation the asset can be accessed for a configured time
- The data is also stored in the corresponding table
- A data protection report shows the accessed assets
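The precedence rule described above (cust cat level overwrites a lower asset level) can be sketched like this. The numeric levels, field names, and `max` resolution are assumptions for illustration only.

```python
# Sketch of the data protection precedence: if the customer category
# assigned to an asset carries a higher protection level than the asset
# itself, the category level wins. Level values and names are assumed.
from dataclasses import dataclass

@dataclass
class Asset:
    name: str
    protection_level: int    # asset-level setting
    cust_cat_level: int = 0  # level from the assigned cust cat

def effective_level(asset: Asset) -> int:
    """The higher of asset level and cust cat level applies."""
    return max(asset.protection_level, asset.cust_cat_level)

a = Asset("Excavator 7", protection_level=1, cust_cat_level=2)
print(effective_level(a))  # 2 -> the cust cat overwrites the asset value
```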
Worktime Detection
The worktime detection (used in the Habau interfaces) was extended to support public holidays, worktimes per day, and plant holidays. The calculation is used in the logbook stream export interface.
There can be customer-based and system-based holiday calendars and 2 plant holiday configs per account.
The holiday calendars are edited directly in the database.
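A minimal sketch of how the extended worktime detection could combine these inputs. The calendar contents, the worktime window shape, and the exact rules are assumptions, not the actual implementation.

```python
# Sketch: worktime check combining per-weekday working hours with public
# and plant holiday calendars, as the extended worktime detection does.
# All calendar data here is made up for illustration.
from datetime import date, time

PUBLIC_HOLIDAYS = {date(2024, 1, 1)}   # customer- or system-based calendar
PLANT_HOLIDAYS = {date(2024, 8, 15)}   # up to 2 plant holiday configs per account
WORKTIMES = {0: (time(7), time(17))}   # Monday: 07:00-17:00; other days closed

def is_worktime(d: date, t: time) -> bool:
    """True if the timestamp falls inside a worktime window and no holiday."""
    if d in PUBLIC_HOLIDAYS or d in PLANT_HOLIDAYS:
        return False
    window = WORKTIMES.get(d.weekday())
    if window is None:
        return False
    start, end = window
    return start <= t < end

print(is_worktime(date(2024, 1, 8), time(9)))  # Monday inside window -> True
print(is_worktime(date(2024, 1, 1), time(9)))  # public holiday -> False
```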
Restrictions for assets and geofences
The asset restrictions were improved.
Streaming Logbook API
Allows streaming logbook data in the format expected by Habau.
Every call must set the last successfully processed streamlog ID.
Parallel streams can also be run using the caller_stream_ref param.
To start from any point in time, use an existing old streamlog ID.
To start from the beginning, use streamlog ID 0.
opt_cust_cat can be e.g. the configured value on the asset level (e.g. NFZ, BF, ....)