This example illustrates various parameters that can be adjusted when using the on-premise device detection engine; these parameters control when a new data file is sought and when it is loaded by the device detection software. Three main aspects are demonstrated:
- Update on Start-Up
- Filesystem Watcher
- Daily auto-update
License Key
In order to test this example you will need a 51Degrees Enterprise license, which can be purchased from our pricing page. Look for our "Bigger" or "Biggest" options.
Data Files
You can find out more about data files, licenses, etc. on our FAQ page.
Enterprise Data File
Enterprise (fully-featured) data files are typically released by 51Degrees four days a week (Mon-Thu), and on-premise deployments can fetch and download those files automatically. Equally, customers may choose to download the files themselves and move them into place to be picked up by the 51Degrees filesystem watcher.
Manual Download
If you prefer to download files yourself, you may do so here:
https://distributor.51degrees.com/api/v2/download?LicenseKeys=<your_license_key>&Type=27&Download=True&Product=22
Lite Data File
Lite data files (free-to-use, limited capabilities, no license key required) are created roughly once a month and cannot be updated using auto-update. They may be downloaded from GitHub and are included with source distributions of this software.
Update on Start-Up
You can configure the pipeline builder to download an Enterprise data file on start-up.
Pre-Requisites
- a license key
- a file location for the download
- this may be an existing file - which will be overwritten
- or, if it does not exist, the path must end in ".hash" and be in an existing directory
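The rules above can be sketched as a small check. This is only an illustration - the helper name is hypothetical, but the real example performs an equivalent validation before downloading:

```python
import os
import tempfile

def valid_download_location(path):
    # Mirrors the pre-requisites: an existing file may be overwritten;
    # otherwise the name must end in ".hash" and sit in an existing directory.
    if os.path.isfile(path):
        return True
    return path.endswith(".hash") and os.path.isdir(os.path.dirname(path))

with tempfile.TemporaryDirectory() as d:
    print(valid_download_location(os.path.join(d, "51Degrees-Enterprise.hash")))  # True
    print(valid_download_location(os.path.join(d, "missing", "data.hash")))       # False
```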
Configuration
- the pipeline must be configured to use a temp file
create_temp_copy = True,
- a DataFileUpdateService must be supplied
update_event = UpdateEvent()
update_service = DataFileUpdateService()
update_service.on_complete(
    lambda status, file: update_event.set(status))
and supplied to the pipeline builder:
data_file_update_service = update_service,
- update on start-up must be specified, which will cause pipeline creation to block until a file is downloaded
update_on_start = True,
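The wiring above can be exercised with the standard library alone. In this sketch a hypothetical stub stands in for DataFileUpdateService, and the Event subclass records the completion status, mirroring how update-on-start blocks until the download finishes (the status string is a stand-in for the real UpdateStatus enum):

```python
import threading

class UpdateEvent(threading.Event):
    # Records the status passed when the event is set, as in the example
    status = None
    def set(self, status):
        self.status = status
        super().set()

class StubUpdateService:
    # Hypothetical stand-in for DataFileUpdateService: it stores on_complete
    # callbacks and fires them when an update "completes".
    def __init__(self):
        self._callbacks = []
    def on_complete(self, callback):
        self._callbacks.append(callback)
    def complete(self, status, file):
        for cb in self._callbacks:
            cb(status, file)

update_event = UpdateEvent()
update_service = StubUpdateService()
update_service.on_complete(lambda status, file: update_event.set(status))

# A worker thread signals completion; the main thread blocks on the event,
# just as the example blocks until the start-up download finishes.
threading.Thread(
    target=update_service.complete,
    args=("AUTO_UPDATE_SUCCESS", "data.hash")).start()

assert update_event.wait(5)
print(update_event.status)  # AUTO_UPDATE_SUCCESS
```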
File System Watcher
You can configure the pipeline builder to watch for changes to the currently loaded device detection data file and to replace the file in use with the new one. This is useful, for example, if you wish to download and update the device detection file "manually" - i.e. you download it and then drop it into place at the same path as the currently loaded file. That location is checked periodically (every 30 minutes by default), and this frequency can be configured.
Pre-Requisites
- a license key
- the file location of the existing file
Configuration
- the pipeline must be configured to use a temp file
create_temp_copy = True,
- a DataFileUpdateService must be supplied
update_event = UpdateEvent()
update_service = DataFileUpdateService()
update_service.on_complete(
    lambda status, file: update_event.set(status))
and supplied to the pipeline builder:
data_file_update_service = update_service,
- configure the frequency with which the location is checked, in seconds (10 mins as shown)
polling_interval = (10*60),
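The watcher's behaviour can be illustrated with a stdlib sketch that polls a file's modification time - the same signal that a "manual" download into place produces. The function name here is hypothetical; the real watcher runs inside the engine:

```python
import os
import tempfile
import time

def file_changed(path, last_mtime):
    # Compare modification times; returns (changed, new_mtime).
    mtime = os.path.getmtime(path)
    return (mtime != last_mtime, mtime)

# Demonstration: touching the file, as dropping a new download into place
# would, is detected on the next poll.
with tempfile.NamedTemporaryFile(suffix=".hash", delete=False) as f:
    path = f.name
_, mtime = file_changed(path, None)
os.utime(path, (time.time() + 60, time.time() + 60))  # simulate a new file
changed, _ = file_changed(path, mtime)
print(changed)  # True
os.remove(path)
```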
Daily auto-update
Enterprise data files are usually created four times a week. Each data file contains the date when the next data file is expected. You can configure the pipeline to start looking for a newer data file after that time, by connecting to the 51Degrees distributor to see if an update is available. If one is available, it is downloaded and replaces the device detection file currently in use.
Pre-Requisites
- a license key
- the file location of the existing file
Configuration
- the pipeline must be configured to use a temp file
create_temp_copy = True,
- a DataFileUpdateService must be supplied
update_event = UpdateEvent()
update_service = DataFileUpdateService()
update_service.on_complete(
    lambda status, file: update_event.set(status))
and supplied to the pipeline builder:
data_file_update_service = update_service,
- Set the frequency, in seconds, at which the pipeline checks for updates to the data file (10 minutes as shown; around 30 minutes is a recommended polling interval for production environments)
polling_interval = (10*60),
- Set the maximum number of seconds that may be added to the polling interval. This is useful in datacenter applications where multiple instances may be polling for updates at the same time. A recommended amount in production environments is 600 seconds.
update_time_maximum_randomisation = (10*60),
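The interplay of the two settings above can be sketched with a stdlib snippet. The helper is illustrative only - the real randomisation happens inside the update service:

```python
import random

POLLING_INTERVAL = 30 * 60    # base check frequency: 30 minutes
MAX_RANDOMISATION = 10 * 60   # up to 600 extra seconds

def next_check_delay():
    # Each instance waits the base interval plus a random offset, so a fleet
    # of servers does not all contact the distributor at the same moment.
    return POLLING_INTERVAL + random.uniform(0, MAX_RANDOMISATION)

delay = next_check_delay()
print(30 * 60 <= delay <= 40 * 60)  # True
```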
Location
This example is available in full on GitHub.
This example requires a subscription to 51Degrees Device Data, which can be acquired from the 51Degrees pricing page.
Required PyPI dependencies:
# NOTE: this excerpt starts part-way through the example file. The elided
# earlier lines import DeviceDetectionPipelineBuilder, ExampleUtils and
# KeyUtils, and define ENTERPRISE_DATAFILE_NAME.
import os
import shutil
import sys
import threading
from datetime import datetime

from fiftyone_pipeline_core.logger import Logger
from fiftyone_pipeline_engines.datafile_update_service import DataFileUpdateService
from fiftyone_pipeline_engines.datafile_update_service import UpdateStatus

UPDATE_EXAMPLE_LICENSE_KEY_NAME = "license_key"
DEFAULT_DATA_FILENAME = os.path.expanduser("~") + os.path.sep + ENTERPRISE_DATAFILE_NAME

class UpdateEvent(threading.Event):
    # An event which also records the update status passed to set()
    def set(self, status):
        self.status = status
        super().set()

class DataFileUpdateConsole:
    def run(self, data_file, license_key, interactive, logger, output):
        logger.log("info", "Starting example")

        # Fall back to an environment or system variable for the license key
        if not license_key:
            license_key = KeyUtils.get_named_key(UPDATE_EXAMPLE_LICENSE_KEY_NAME)
        if not license_key:
            logger.log("error",
                "In order to test this example you will need a 51Degrees Enterprise "
                "license which can be obtained on a trial basis or purchased from our\n"
                "pricing page http://51degrees.com/pricing. You must supply the license "
                "key as an argument to this program, or as an environment or system variable "
                f"named '{UPDATE_EXAMPLE_LICENSE_KEY_NAME}'")
            raise Exception("No license key available")

        if data_file is not None:
            data_file = ExampleUtils.find_file(data_file)
            if not os.path.exists(data_file):
                if not os.path.exists(os.path.dirname(data_file)):
                    logger.log("error",
                        "The directory must exist when specifying a location for a new "
                        f"file to be downloaded. Path specified was '{data_file}'")
                    raise Exception("Directory for new file must exist")
                logger.log("warning",
                    f"File {data_file} not found, a file will be downloaded to that location on "
                    "start-up")

        if data_file is None:
            data_file = os.path.realpath(DEFAULT_DATA_FILENAME)
            logger.log("warning",
                f"No filename specified. Using default '{data_file}' which will be downloaded to "
                "that location on start-up, if it does not exist already")

        copy_data_file_name = data_file + ".bak"
        if os.path.exists(data_file):
            # Build a throw-away pipeline to check the tier of the existing file
            pipeline = DeviceDetectionPipelineBuilder(
                data_file_path = data_file,
                performance_profile = "LowMemory",
                usage_sharing = False,
                licence_keys = "").add_logger(logger).build()
            ExampleUtils.check_data_file(pipeline, logger)
            if ExampleUtils.get_data_file_tier(pipeline.get_element("device")) == "Lite":
                logger.log("error",
                    "Will not download an 'Enterprise' data file over the top of "
                    "a 'Lite' data file, please supply another location.")
                raise Exception("File supplied has wrong data tier")
            logger.log("info",
                "Existing data file will be replaced with downloaded data file")
            logger.log("info",
                f"Existing data file will be copied to {copy_data_file_name}")

        if interactive:
            output("Please note - this example will use available downloads "
                "in your licensed allocation.")
            user_input = input("Do you wish to continue with this example (y)? ")
            if user_input is None or user_input == "" or not user_input.startswith("y"):
                logger.log("info", "Stopping example without download")
                return

        logger.log("info", "Checking file exists")
        if os.path.exists(data_file):
            logger.log("info",
                f"Existing data file copied to {copy_data_file_name}")
            shutil.copy(data_file, copy_data_file_name)

        output(
            "Creating pipeline and initiating update on start-up - please wait for that "
            "to complete")

        update_event = UpdateEvent()
        update_service = DataFileUpdateService()
        update_service.on_complete(
            lambda status, file: update_event.set(status))

        pipeline = DeviceDetectionPipelineBuilder(
            data_file_path = data_file,
            create_temp_copy = True,
            data_file_update_service = update_service,
            licence_keys = license_key,
            update_on_start = True,
            file_system_watcher = True,
            data_update_product_type = "V4TAC",
        ).add_logger(logger).build()

        output(f"Update on start-up complete - status - {update_event.status}")

        if update_event.status == UpdateStatus.AUTO_UPDATE_SUCCESS:
            # Reset the event, then touch the file to trigger the watcher
            update_event.clear()
            output(
                "Modifying downloaded file to trigger reload - please wait for that "
                "to complete")
            now = datetime.now().timestamp()
            try:
                os.utime(data_file, (now, now))
            except OSError as e:
                raise Exception("Could not modify file time, abandoning example") from e
            if update_event.wait(120):
                output(f"Update on file modification complete, status: {update_event.status}")
            else:
                output("Update on file modification timed out")
        else:
            logger.log("error", "Auto update was not successful, abandoning example")
            error_message = f"Auto update failed: {update_event.status}"
            if update_event.status == UpdateStatus.AUTO_UPDATE_ERR_429_TOO_MANY_ATTEMPTS:
                output(error_message)
            else:
                raise Exception(error_message)

        output("Finished Example")

def main(argv):
    license_key = argv[0] if len(argv) > 0 else None
    data_file = argv[1] if len(argv) > 1 else None
    logger = Logger(min_level="info")
    DataFileUpdateConsole().run(data_file, license_key, True, logger, print)

if __name__ == "__main__":
    main(sys.argv[1:])