Asset Management
License Consumption and Asset States
Will an asset be included in license consumption if it is moved to a disposed or expired state?
No, Disposed or Expired assets do not consume a license.
How to create an asset state similar to Expired or Disposed State?
It is not possible to create a state similar to Expired or Disposed. However, you can rename these states to suit your needs. You can also create a custom field to store sub-state values.
Unique Identifiers for Assets
What are the unique identifiers for mobile devices added via CSV, MDM, and Intune sync?
| Devices added via | Unique identifiers |
| --- | --- |
| CSV import | Asset Name or IMEI |
| ManageEngine MDM | Serial Number & IMEI (or) Discovered Serial Number & IMEI |
| Intune Sync | Serial Number & IMEI (or) Discovered Serial Number & IMEI |
What are the unique identifiers for workstations?
While scanning assets via Asset Explorer Cloud,
- Assets will be added or updated based on the service tag value as a priority.
- If an asset with a matching service tag is not found, the asset will be added or updated based on the matching host name.
- If the asset is not found with both the service tag and host name but has MAC address identification enabled, the asset will be added or updated based on its MAC address.
What are the unique identifiers for printers, routers, and switches?
Printers, routers, and other devices which are scanned based on SNMP are added or updated based on the asset name.
Asset Import, Export and Errors
I am trying to import a file and facing the error "Product is not compatible with the product type". How can I resolve this?
- Check if the product provided for the asset in the spreadsheet is created under a different product type in the application.
- Update the existing product to the desired product type, or update the product type column accordingly in the imported file.
For example, if an asset is imported with the product "Dell 5000" and product type "Workstation", and the same asset is imported next time with the product "Dell 5000" and product type "Switch", this error will be displayed.
How to export mobile device, smartphone, or tablet assets?
- Go to Reports > New Custom Reports.
- Select the type as Tabular Reports.
- Choose the module as Mobile Devices.
- Click Proceed to Report Wizard.
- Select the required columns in the Select columns to display section.
- Click Run Report.
You can export data using the Export as drop-down available on the top right corner.
Asset Categorization and Product Types
Where can I find the categorization of Laptops/Desktops in the asset module?
Currently, the categorization is available as a filter under the Computers list view. However, it will be deprecated soon. Therefore, users are advised to create separate product types for laptops and desktops under Computers and update the asset's product type accordingly.
Is it acceptable to create a separate product type for Laptops/Desktops?
Yes, you can create product types for Laptop/Desktop, but make sure that these product types are created as children of the Computer product type.
QR Codes and Barcodes
Is it possible to generate QR codes or barcodes in bulk for all assets?
Currently, it is not possible to generate QR codes or barcodes for all assets in bulk. However, you can bulk generate QR codes or barcodes for assets based on the product. As a workaround, you can update QR codes or barcodes in bulk using the import option.
How to generate asset tags or code for All Assets?
Please contact Asset Explorer Cloud support with the required format.
Asset Life Cycle and State Management
How can I apply the asset life cycle to existing assets?
You can assign a life cycle to an existing asset from the Actions menu in the asset list view. Ensure the current asset state is available within the asset life cycle.
How can I add fields to the Modify State window?
You can add a field in the Modify State window in assets via asset life cycle.
- Navigate to Setup > Automation > Life Cycles > Asset.
- Click New Life Cycle.
- Add a transition. In the During phase of transition, you can mandate the field as per your requirement.
You cannot add a field in the Modify State window in assets where a life cycle is not configured.
When an asset life cycle is configured, the default behavior for the In Use state will not occur automatically. You will need to manually configure transitions within the life cycle settings to replicate the system behavior.
1. For Incoming Transitions to "In Use":
- Configure the life cycle to make the User, Department, or Associated Asset fields mandatory when transitioning the asset to the In Use state.
2. For Transitions out of "In Use":
- Set up a Field Update action in the life cycle to clear the User, Department, and Associated Asset fields (set them to None) when transitioning out of the In Use state.
How to create a state that maintains User/Department information similar to the In Use state?
You can configure the asset life cycle and set the User/Department field as a mandatory field in the custom state transitions.
Asset Notifications and Acknowledgments
Users are not receiving the acknowledgement link in the email when assets are assigned to them, even if the asset assignment notification is enabled under Notification Rules.
Add the $AcknowledgementLink variable in the Asset Assigned Notification Template under Setup > Automation > Notification Rules > Asset.
API and Data Retrieval
How to get all assets via API?
By default, the GET API of assets will provide only the first 10 assets.
Use input data while calling the get list API, as it details how and what data needs to be fetched from the server. Refer to the links below to learn more:
- Help guide for input data format
- Help guide for how to use input data in asset get list
You can get a maximum of 100 assets at a time via API. You can use the "start_index" and "row_count" keys in "list_info" to get more assets. Refer to the below example for the "input_data":
Eg: input_data = {"list_info":{"row_count":100,"start_index":1,"fields_required":["name","user","department","used_by_asset","product","state","asset_tag","purchase_cost","ip_address","mac_address","product_type","type","category","lifecycle","loan"]}}
Check the response for the "has_more_rows" key. If the value is true, there are more assets available on the server. In that case, use the row_count and start_index as below:
input_data = {"list_info":{"row_count":100,"start_index":101,"fields_required":["name","user","department","used_by_asset","product","state","asset_tag","purchase_cost","ip_address","mac_address","product_type","type","category","lifecycle","loan"]}}
By changing the start_index to 101 you will get the next 100 assets.
To get the new "start_index" value, use the formula ("start_index" (previous) + "row_count").
Iterate until the "has_more_rows" key value becomes false.
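As a sketch, the pagination loop above can be written in Python. This is illustrative only: fetch_page is a placeholder for whatever HTTP call you use to send input_data to the GET list endpoint, and the field list is abbreviated.

```python
import json

def build_input_data(start_index, row_count=100):
    # "list_info" controls pagination; "fields_required" trims the response
    return json.dumps({"list_info": {
        "row_count": row_count,
        "start_index": start_index,
        "fields_required": ["name", "user", "product", "state"],
    }})

def fetch_all_assets(fetch_page, row_count=100):
    # fetch_page(input_data) must return the parsed JSON response; in a real
    # script it would issue the GET list request with input_data as a URL param.
    assets, start_index = [], 1
    while True:
        response = fetch_page(build_input_data(start_index, row_count))
        assets.extend(response.get("assets", []))
        if not response.get("list_info", {}).get("has_more_rows"):
            return assets
        # New start_index = previous start_index + row_count
        start_index += row_count
```

The loop stops as soon as "has_more_rows" is absent or false, so it also handles result sets smaller than one page.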
How to search assets based on field value?
Use input data in the query param while invoking the get list API, as it details how and what data needs to be fetched from the server.
Use the search_criteria field to filter the assets.
search_criteria: JSONARRAY | JSONOBJECT
The field name and the value to be searched, with specific conditions. Supported conditions: is, is not, lesser than, greater than, lesser or equal, greater or equal, contains, not contains, starts with, ends with, between.
e.g., {field:'project.id',condition:'is',value:project_id}
Example: For using them in POSTMAN, use input_data as the param key and below-given list_info as the value (encode the value)
{"list_info":{"row_count":10,"start_index":1,"fields_required":["name","user","department","used_by_asset","product","state"],"get_total_count":false,"search_criteria":[{"field":"name","condition":"like","value":"router","logical_operator":"and"}]}}
Replace "name" in the field key of search_criteria with the field (e.g., serial_number, barcode) by which you want to filter assets.
Check the sample request area on how the input_data is used in all formats.
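As a sketch of the encoding step in Python (the filter values are the same illustrative ones used in the example above):

```python
import json
from urllib.parse import urlencode, parse_qs

# Filter assets whose name contains "router" (illustrative values)
list_info = {"list_info": {
    "row_count": 10,
    "start_index": 1,
    "fields_required": ["name", "user", "state"],
    "get_total_count": False,
    "search_criteria": [{"field": "name", "condition": "like",
                         "value": "router", "logical_operator": "and"}],
}}

# The input_data value must be URL-encoded before it goes into the query param
query_string = urlencode({"input_data": json.dumps(list_info)})
```

The resulting query_string can be appended to the GET list URL, which is equivalent to setting input_data as the param key in Postman and encoding the value.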
How to get required fields in the GET list of an asset?
By default, only a few fields are available in the asset GET list. To fetch the required fields, use the "fields_required" key in the "list_info" object, which can be sent to the server via the "input_data" URL parameter.
fields_required (JSONARRAY)
Fields that are needed in the response can be given as 'fields_required'. A JSON array must be given as input. Only the fields given in fields_required will be shown in the response, along with the identifier field (i.e., id). For example:
["asset_tag","serial_number"]
Example:
{"list_info":{"row_count":10,"start_index":1,"fields_required":["name","user","asset_tag","serial_number","product","state"],"get_total_count":false}}
Refer to the following help guide for:
- Usage of fields_required
- Sending the input_data to server
Currently, certain hardware-related fields whose values can contain multiple rows can't be fetched in the GET list of workstations, such as:
"processors", "memory_modules", "hard_disks", "logical_drives", "physical_drives", "printer_details", "mouse", "video_card", "ports", "usb_controllers", "monitors", "software", "service_packs", "network_domain", "network_dns", "network_adaptor"
Asset Scanning
Probe Installation and Setup
I see many probes added to my license. Should I add all of them to scan the entire set of assets?
Probes are provided based on the number of assets in your pack: for every 50 assets, one probe is provided. However, it is not necessary to install and use all probes in order to scan all assets. If one probe can reach and scan all assets, you can install only that one probe in your network.
What are the prerequisites for adding a probe?
- Operating Systems: Windows Server 2012 or later, Windows 10 or later
- Dependent Software: .NET Framework 4.5 or later
- Hardware Requirements: Minimum 4GB of RAM, Processor Speed of 1.80 GHz and Hard Disk space of 5 GB
Scanning Requirements and Prerequisites
What are the prerequisites for scanning assets?
Windows Machines:
- Remote Registry service must be running in the target workstations.
- Port 139 (File Sharing) must be open on the target workstations.
- Admin$ share ( \\TargetMachine\admin$ share ) of the target workstation must be accessible from the probe machine.
Linux Machines:
- SSH port 22 must be open.
Printers, Routers, Switches:
- SNMP (v1 / v2c / v3) must be enabled and the corresponding credentials must have been configured for the network scan.
What is the default state of assets when they are initially added via a network scan?
New assets added to Asset Explorer Cloud via network scan will be in the In Store state by default.
Probe Issues and Troubleshooting
Why does the probe go inactive?
If there is any connectivity problem between the probe-installed machine and Asset Explorer Cloud, the probe may go down. Go to the machine where the probe is installed and try to open the Asset Explorer Cloud URL in a browser to see if it is reachable.
I don't think all my assets were scanned properly. Where can I find the probe logs?
You can go to the following logs path on the probe installed machine: \\ManageEngine\SDPODProbe\logs
I am seeing the message "No alive host found" in the last scan summary. What should I do?
From the probe-installed machine, ping any of the target IPs in the network or domain range that you are trying to scan. If the IP is not reachable, the machines in the network or domain range cannot be scanned.
I am getting incorrect information from a scanned workstation. How can I verify the information?
The probe uses stable Microsoft queries to get the information from the registry, so the information is expected to be accurate. However, there could be some differences between various scan sources, such as probe scan and other integrations. If you find any incorrect information, please reach out to our support.
Self-Scan and Remote Scanning
How can I scan assets that are not connected to our network or domain? For example, to scan assets of remote users.
Is there a way to trigger the self-scan script automatically rather than manually sharing it with all users?
Yes, it is possible to trigger self-scan automatically. When self-scan is executed via PowerShell, it assigns a task to the task scheduler, and the task runs the self-scan at the scheduled time daily.
I am performing a self-scan and facing "Error 401" in the command prompt. What could be causing this?
In most cases, this error is displayed if the API key or the server URL that the self-scan points to is incorrect.
- Verify if the API key is copied correctly from the application.
- Check whether the server URL of your application points to sdpondemand.manageengine.com by default. If not, change the server URL by using the extra parameter server=<your application url> for Windows and serverUrl=<your application url> for Linux & Mac machines.
- Use the server URL as per your signed-up data center.
Asset Assignment and Auto-Assign
How can I automatically assign assets to users?
Assets can be automatically assigned to users using the asset auto assign feature.
- Go to Setup > Automation > Asset Auto Assign.
- Enable Asset Auto Assign.
- Configure the number of consecutive times for assignment and select the user additional field where the AD Login Name is stored.
If you are using on-premises AD, create an additional field for the user to store the AD Login Name. Populate the AD Login Name for all users who use the provisioning tool by using "ad_loginname" in the provisioning tool's attribute mapping page.
If you are using Azure AD, map the AD_LOGINNAME field.
Common Scan Errors and Fixes
What does error code 12023 mean?
During the scan, the probe will copy SDPOD_MiniAgent.exe and ae_scan.vbs to the admin$ shared path of the target Windows machines and invoke a service with the name SDPOD_MiniAgent.
This service will execute the ae_scan.vbs file, scan the machine, and generate a file containing inventory data. The generated file will be copied back by the probe machine. Once the scan process is done, the probe will uninstall the service and delete the files that were copied to the end machine.
If the service is not invoked, then error 12023 would be thrown.
Usually, firewall/antivirus software will restrict this operation and treat the connection from the probe as a threat. Therefore, disable the antivirus/firewall on any one of the target machines where this error occurs and then run the scan. If the scan works, configure the probe-installed machine's IP address as safe in the firewall/antivirus so the scan will work properly.
How can I troubleshoot the error where port 139/22 is not open?
- Go to the probe machine.
- Open a command prompt.
- Run: cd C:\ManageEngine\SDPODProbe\bin\NMap
- Run: nmap.exe -n -p 135,139,445,22 Host_IP (e.g., nmap.exe -n -p 135,139,445,80 192.168.1.1)
- The command displays whether the target host is up and the list of open ports on that workstation.
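If nmap is unavailable, the same ports can be probed with a short Python sketch (the host and port list are placeholders; this only tests TCP reachability, not the services behind the ports):

```python
import socket

def check_ports(host, ports, timeout=2.0):
    # Returns {port: True/False} based on whether a TCP connect succeeds
    results = {}
    for port in ports:
        sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
        sock.settimeout(timeout)
        try:
            results[port] = sock.connect_ex((host, port)) == 0
        finally:
            sock.close()
    return results

# Example: check_ports("192.168.1.1", [135, 139, 445, 22])
```

Run it from the probe-installed machine so the result reflects the same network path the scan would use.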
How can I troubleshoot the error indicating that the remote registry is not enabled?
If the remote registry is not enabled,
- Go to the target machine for which the error is coming.
- Click Start and search for Services.
- In the Services dialog box, search for Remote Registry.
- Right-click Remote Registry and click Start.
Advanced Scanning and Customization
What attributes can be fetched during a probe scan?
Are there any special considerations before scanning a virtual host or machine?
For VM hosts, there are two types of devices: VMware, and Windows using Hyper-V. To scan a virtual machine hosted on VMware, you need to provide the credentials under the VMware type and use that credential during the network scan.
For Windows Hyper-V, you can provide the credentials for the Windows type and use those credentials in the network scan. During a Domain Scan, the Windows machine connected to the domain must meet the prerequisites for a Windows machine.
Remote Access and Connectivity
Is it possible to take unattended remote access of a workstation from Asset Explorer Cloud?
Yes, it is possible to take Unattended Remote Access of a workstation from Asset Explorer Cloud. To do so,
- Update the credentials for that asset using the Actions menu in the asset list view.
- Ensure all the prerequisites mentioned for scan are met for taking the remote control.
Endpoint Central Integration
Asset Assignment and Asset Remote Add-on
Will assets added from Endpoint Central to Asset Explorer Cloud have users assigned? If not, what steps should be taken?
Assets can be automatically assigned to users using the asset auto assign feature.
1. Go to Setup > Automation > Asset Auto Assign.
2. Enable Asset Auto Assign.
3. Configure the number of consecutive times for assignment, and select the user additional field where the AD Login Name is stored.
If you are using on-premises AD, create an additional field for the user to store the AD Login Name. Populate the AD Login Name for all users who use the provisioning tool by using "ad_loginname" in the provisioning tool's attribute mapping page. If you are using Azure AD, map the AD_LOGINNAME field.
Will the asset remote add-on work if my account is integrated with Endpoint Central?
Currently, we do not support remote control via the Endpoint Central integration. However, we are working on a tight integration with Endpoint Central that will enable users to take remote control via Endpoint Central agents.
Scanning Assets
We already have Endpoint Central. Should I still use the probe to scan assets, or can I integrate it with Asset Explorer Cloud?
If you are already using Endpoint Central, please integrate Endpoint Central with Asset Explorer Cloud to scan assets.
Troubleshooting Asset Sync Issues
Assets scanned via EPC are not displayed in the application. How do I debug this?
- Do a global search for All Assets with the service tag and check if the asset is found.
- If two assets have the same service tag, the asset will be overwritten. To avoid this, add the duplicate service tag in the Setup > Probes & Discovery > Settings > General > Invalid Service tag list and save it.
- Check if MAC address identification is enabled under Setup > Probes & Discovery > Settings > General > Enable/Disable MAC address identification during scan. If enabled, disable MAC address identification and save it. There is a possibility that two different assets might have the same MAC address at different times, which will result in asset overwriting.
- Custom Product Type: Only Computer and its child product types will be synced from Endpoint Central (Agent). Ensure that the asset's model (Device Model in EC) is under the Computer product type or any of its children from Setup > Customization > Asset Management > Product. Search for the particular product, edit it, change the parent product type to Computer or any of its child product types, and save it.
- Wait for the next scheduled scan to complete after performing steps 2, 3, or 4. The asset will get added to Asset Explorer Cloud during the successive scans.
Software
Software Licensing and User Management
How to allocate a license to users?
Licenses can be assigned to either users or workstations.
To assign a license to a user,
- Use Concurrent License type or Named User License type.
- Concurrent licenses enable you to assign multiple users to a single license.
- Named User licenses are individual licenses.
- Users can be linked to a license during or after its creation using the UsedBy field or table.
- Customers can also create similar custom license types with a Track By value of User from Setup > Customization > Asset Management > Software License Types.
How to import users from OKTA using Provisioning Tool?
Importing people from OKTA is still in the beta phase of development. Currently, you can follow the steps below to import users:
Sync data from OKTA to ServiceDesk Plus Cloud before configuring the provisioning tool:
- Close the provisioning tool application if it's open.
- Navigate to C:\Users\<User>\ZohoProvisioning\ and edit the provisioning.conf file.
- Add the configuration ldap_version=2 to the end of the file. [Note: Modifying other entries in this file may result in errors when using the provisioning tool.]
- Save and close the file.
- Restart the provisioning tool.
To configure the provisioning tool with OKTA LDAP configurations, enable the LDAP Interface in OKTA Directory Integrations before starting the provisioning tool, as explained below:
- Ensure LDAP Interface is enabled in OKTA Directory settings.
- Copy the OKTA Host available in the LDAP Interface details and add the LDAP Server as ldaps://<host>:636 in the provisioning tool. For example: ldaps://trial-123.ldap.okta.com:636.
- Set Use SSL as true.
- Copy the Base DN available in LDAP Interface details and paste in the provisioning tool.
- For Authorized User, copy the User Base DN available in the LDAP Interface details. The Login User details must be appended before the User Base DN.
- For example, if the User Base DN is dc=trial-123456,dc=okta,dc=com and the username of the authorized user or super admin is adminuser@xyz.com, then the Authorized User has to be in the format "uid=<username>,UserBaseDN".
- Example: uid=adminuser,dc=trial-123456,dc=okta,dc=com
- The Authorized User password must be set as the password.
- Proceed to the next step; the connection to OKTA LDAP should be successful.
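The Authorized User format described above can be composed programmatically; a minimal sketch (the login name and Base DN are the example values from the steps, and the helper name is my own):

```python
def okta_bind_dn(login_name, user_base_dn):
    # Authorized User = "uid=<username>," followed by the User Base DN;
    # the domain part of the login (after "@") is dropped
    uid = login_name.split("@")[0]
    return f"uid={uid},{user_base_dn}"

# Example from the steps above:
# okta_bind_dn("adminuser@xyz.com", "dc=trial-123456,dc=okta,dc=com")
```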
API Syntax and Description
How to set up Intranet API Integration using probe and webhooks?
We have an existing feature where we use a probe and a custom function to make Intranet API calls. We have also provided this API to custom widgets and made ADMP integration possible.
Users can also set up Intranet API Integration using webhooks, as explained below:
This is a private add-on and is not enabled by default. Customers can contact support to enable the add-on.
- Go to Setup > Automation > Custom Action > Webhooks.
- Fill in the details related to the Intranet APIs. Unlike the usual webhooks, you can use local URLs and custom port numbers in the URLs.
- Select the Endpoint Type as Intranet (this field will be visible only if the add-on is enabled).
- Select the probe that will receive the API Request from the ServiceDesk Plus Cloud and place a call to the target server.
- If the callback custom function option is selected, it will receive the response from the probe. This response can be used to perform further actions if needed.
You can use this webhook in workflows, life cycles, triggers, and other custom actions to execute any Intranet action. When the action conditions are met, the webhooks will be executed automatically on the target server.
How to automatically create user in AD using probe and webhooks?
You can automatically create a user in Active Directory using ManageEngine ADManager Plus when a request is raised with the template User Creation and the request status changes to To Create User In AD.
Step 1: Create a webhook
- Install a probe in the same network as ADManager Plus.
- Create a webhook with ADManager Plus URL and other necessary parameters using the available request $ variables.
- Choose the Endpoint Type as Intranet.
- Select the probe which is in the same network as the ADManager Plus server.
- Choose a custom function (if already available) to process the request sent by ADManager Plus server.
- Click Save.
Step 2: Create a request life cycle
- Create a request life cycle and associate the template User Creation.
- Add To Create User In AD status to the request life cycle.
- In the After transition of the status, add a custom action and select the webhook configured above.
- Click Save.
How to update the status of the CI when it is updated in CMDB using probe and webhooks?
When a CMDB record is edited, you can trigger an Intranet API Call to fetch server details.
Create Build API: localserver-build:8080/api/server_details
- Create a CMDB trigger using the When edited option.
- Set relevant conditions.
- Under custom action, choose Webhook.
- Create a webhook with the necessary details.
- Choose the Endpoint Type as Intranet.
- Select the desired probe and also the custom function, which is used to check the running status from the response.
- Save the trigger.
Intranet APIs using Probe - Syntax and Sample Code
The following INPUT_DATA JSON should be constructed for APIs triggered via custom functions in triggers, life cycle, etc.
API Input Format/Syntax:
{
  "action": "REST",
  "callback_type": "default",
  "callback_param": "param for type",
  "probe": {
    "id": "100001234567890"
  },
  "action_params": {
    "method": "GET",
    "path": "http://zylker.com/api/abc",
    "headers": {
      "some_key": "some_value"
    },
    "query_parameters": {
      "some_key": "some_value"
    },
    "raw_content": "any content here"
  },
  "post_action_params": {
    "any_key": "any_value"
  }
}
API Syntax and Description:
| Key | Description |
| --- | --- |
| action | Type of Action (supported: REST) |
| callback_type (if none provided, "default") | "default" / "custom_function": how the response needs to be delivered. "default" will deliver the response to the client; "custom_function" will deliver the response to a specific callback custom function. |
| callback_params | Any params that need to be delivered for the callback_type. Mandatory for type "custom_function". |
| action_params | Actual parameters related to the request. |
| probe (JSONObject) | Probe in which the command should be executed. This should have a Probe ID. |
| method | REST method (get, post, put, delete) |
| path | The API URL to be executed. |
| headers | Any headers that need to be sent. (optional) |
| query_parameters | Any query params that need to be sent. (optional) |
| raw_content | Any string content that needs to be sent. (optional) |
| post_action_params | Any data sent in this will be sent back (depending on callback_type) along with the response from the "path" API. |
Sample Callback Custom Function Code:
Parameters should be of the "Form Data" type, with "response" as a "Map" type parameter.
probe_command = response.get("probe_command");
probe_command_id = probe_command.get("id").toString();
//if post action params are needed
post_action_params = response.get("post_action_params");
//getting the result string from probe_command
thirdparty_response = probe_command.get("result");
//thirdparty_response_as_map = response from the third-party server
thirdparty_response_as_map = Map(thirdparty_response);

//Perform any actions based on the response received
return null;
Sample Custom Function Code for invoking Intranet API using probe via Custom Function instead of Webhook:
//Dummy ThirdParty Headers
thirdPartyHeaders = Map();
thirdPartyHeaders.put("Authorization","xxxxxxxxxxxxxxxxxxxxxxxxx");
thirdPartyHeaders.put("Accept","vnd.manageengine.com.v3+json");
//Dummy ThirdParty QueryParameters
thirdPartyQueryParameters = Map();
//Dummy ThirdParty RawContent
thirdPartyRawContent = "some raw content";
// constructing JSON
params = Map();
probe = Map();
probe.put("id","100009876543210");
params.put("probe",probe);
params.put("action","REST");
action_params = Map();
action_params.put("method","get");
action_params.put("headers",thirdPartyHeaders);
action_params.put("query_parameters",thirdPartyQueryParameters);
action_params.put("raw_content",thirdPartyRawContent);
action_params.put("path","https://zylker.com/api/dummyapi");
params.put("action_params",action_params);
post_action_params = Map();
//You can send any data from the request object and get the data back in the
//callback CF. For example, we are sending the request ID in post_action_params.
post_action_params.put("request_id",requestObj.get("id"));
params.put("post_action_params",post_action_params);
input_data = Map();
input_data.put("input_data",params);
//info input_data;
//Invoke ProbeCommandsApi
result = zoho.sdp.invokeurl
[
url :"/api/v3/probe_actions/_command_request"
type :POST
parameters:input_data
];
info result;