DataHub delivers a user-friendly web-based experience that is optimized for PC, tablet and mobile phone interfaces—so you can monitor and control your file transfers anywhere, from any device.
DataHub’s true bi-directional hybrid/sync capabilities enable organizations to leverage and preserve content across on-premises systems and any cloud service. The experience is seamless to users: new files and file changes from either system are automatically reflected in the other.
DataHub uses jobs to perform specific actions between the source and destination platforms. The most common job types are copy and sync; please see Create New Job | Transfer Direction for more information.
All jobs can be configured to run manually or on a defined schedule. This option will be presented as the last configuration step.
To create a job, select the Jobs option from the left menu, and click on Create Job. DataHub will lead you through a wizard to select all the applicable options for your scenario.
The main job creation steps include:
- Selecting a Job Type
- Configuring Locations
- Defining Transfer Policies
- Defining Job Transfer Behaviors
- Advanced Options
- Summary | Review, Create Job, and Schedule
Job Type
Job type defines the kind of job and the actions the job will perform with the content. The available job types are Basic Transfer, Folder Mapping, User Account Mapping, and Network Home Drive Mapping.
Basic Transfer
Transfer items between one connection and another
This will copy all content (files and folders) from the source to the destination. Each job run will detect any new content on the source and copy it to the destination.
For more information, please see Create New Job | Transfer Direction
Folder Mapping
Identify a source folder on one connection and a child job will be generated for each folder the source contains
Folder mapping jobs are ideal for migrations where you wish to control the transfer at a granular level without the effort of creating individual jobs. DataHub will automatically create a unique job for each folder in your hierarchy, inheriting configurations from the parent/master job.
The folder mapping job may be controlled and manipulated just like a transfer job but when executed it will not transfer data. Instead, each execution creates, modifies, or deletes its child jobs which are then responsible for the transfer of data.
For more information, please see Create New Job | Transfer Direction
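As a rough illustration of this create/modify/delete behavior, the sketch below reconciles a parent mapping job's child jobs against the folders currently on the source. The names (ChildJob, reconcile_child_jobs) are hypothetical, not DataHub APIs; they only mirror the behavior described above.

```python
# Illustrative sketch only: a folder-mapping parent run creates, updates, or
# deletes child jobs so they mirror the source's top-level folders.
from dataclasses import dataclass

@dataclass
class ChildJob:
    folder: str
    config: dict  # inherited from the parent/master job

def reconcile_child_jobs(parent_config, source_folders, existing_children):
    """Create, update, or delete child jobs so they mirror the source folders."""
    children = {c.folder: c for c in existing_children}
    desired = set(source_folders)

    created = [ChildJob(folder=f, config=dict(parent_config))
               for f in desired if f not in children]
    updated = [c for f, c in children.items() if f in desired]
    for c in updated:                       # child jobs inherit parent settings
        c.config.update(parent_config)
    deleted = [c for f, c in children.items() if f not in desired]
    return created, updated, deleted

# Example: the source now has two top-level folders; one child job already exists.
created, updated, deleted = reconcile_child_jobs(
    parent_config={"conflict_policy": "latest_wins"},
    source_folders=["Finance", "HR"],
    existing_children=[ChildJob("Finance", {"conflict_policy": "latest_wins"})],
)
print([c.folder for c in created])   # ['HR']
print([c.folder for c in updated])   # ['Finance']
print([c.folder for c in deleted])   # []
```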
User Account Mapping
Map user accounts on one connection to those on another. Individual jobs will be generated to transfer items between mapped accounts
User account mapping jobs are ideal for migrations where you wish to control the transfer at a granular level without the effort of creating individual jobs. DataHub will automatically create a unique job for each user account that can be matched between the source and destination connections.
Important! User account mapping jobs require connections with the ability to impersonate other users.
For more information, please see Create New Job | Transfer Direction
Network Home Drive Mapping
Map network home drives to user accounts on another connection. Individual jobs will be generated to transfer items between the source folder and destination account
Network home drive mapping jobs are ideal for migrations where you wish to control the transfer at a granular level without the effort of creating individual jobs. DataHub will automatically create a unique job for each home drive that can be matched to a user account on the destination connection.
Important! Network home drive mapping jobs require connections with the ability to impersonate other users.
For more information, please see Create New Job | Transfer Direction
Basic Transfer - Define Source & Destination Locations
All platform connections made in the DataHub Platform application will be available in the location drop-down lists when creating a job.
- If your connections were created with Administrative privileges, you may also have the ability to impersonate another user within your organization
- Source defines the location of the content you wish to transfer
- Destination defines the location where you would like your content to go
Folder Mapping - Define your Source and Destination Paths
If you are an administrator using Impersonation, enable Run as user... and choose the user you wish to access
All job features defined while creating the parent job will be applied to the child jobs it creates
Source / Destination Path: If you wish to transfer all content, leave the source path blank; a child job will be created for every top-level folder. If a folder is selected for the source path, a child job will be created for every subfolder within that parent folder.
Child Job Source / Destination Path: This directory within each folder will be used as the source.
Target the root of each folder: The child job will be created for the first-level folder relative to the source path.
Target a specific directory within each folder: If there is a folder that exists in every directory, you can define it with this option (illustrated in the sketch below).
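As a minimal sketch of how these two options differ, the snippet below builds the path a child job would read from, assuming POSIX-style paths. child_source_path is a hypothetical helper, not a DataHub function.

```python
# Illustrative sketch only: how the child-job path options could resolve.
import posixpath

def child_source_path(source_path, folder, target_directory=None):
    """Return the path a child job would read from.

    target_directory=None   -> "Target the root of each folder"
    target_directory="Docs" -> "Target a specific directory within each folder"
    """
    base = posixpath.join(source_path, folder) if source_path else folder
    return posixpath.join(base, target_directory) if target_directory else base

print(child_source_path("", "ProjectA"))               # ProjectA (blank source path)
print(child_source_path("Teams", "ProjectA"))          # Teams/ProjectA
print(child_source_path("Teams", "ProjectA", "Docs"))  # Teams/ProjectA/Docs
```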
User Account Mapping - Define your Source and Destination Paths
If you are an administrator using Impersonation, enable Run as user... and choose the user you wish to access
Source / Destination Path: If you wish to transfer all content, leave the source path blank; a child job will be created for every user account. If a directory is selected for the source path, a child job will be created for every user account within that parent directory.
Child Job Source / Destination Path: This directory within each user account will be used as the source. If the directory does not exist on the destination, DataHub will create it.
Target the root of each user account: The child job will be created for the first-level folder/account relative to the source path
Target a specific directory within each user account: If there is a folder that exists in every user account, you can define it with this option
Network Home Drive Mapping - Define your Source and Destination Paths
If you are an administrator using Impersonation, enable Run as user... and choose the user you wish to access
Source / Destination Path: If you wish to transfer all content, leave the source path blank; a child job will be created for every home drive (see the matching sketch below). If a directory is selected for the source path, a child job will be created for every home drive within that parent directory.
Child Job Source / Destination Path: This directory within each home drive will be used as the source. If the directory does not exist on the destination, DataHub will create it.
Target the root of each folder: The child job will be created for the first-level folder/account relative to the source path
Target a specific directory within each folder: If there is a folder that exists in every home drive, you can define it with this option
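The matching step behind both user account mapping and network home drive mapping might look something like the sketch below. The normalization rule (lower-casing and dropping the @domain part) is an assumption for illustration only; DataHub's actual matching rules may differ.

```python
# Illustrative sketch only: pairing source accounts (or home drive folder names)
# with destination accounts. The normalization rule here is an assumption.
def normalize(identifier: str) -> str:
    return identifier.split("@")[0].strip().lower()

def match_accounts(source_ids, destination_ids):
    """Return (matched pairs, unmatched source entries)."""
    dest_by_key = {normalize(d): d for d in destination_ids}
    matched, unmatched = [], []
    for s in source_ids:
        d = dest_by_key.get(normalize(s))
        if d:
            matched.append((s, d))    # a child job would be created for this pair
        else:
            unmatched.append(s)       # no destination account found; no child job
    return matched, unmatched

# Home drive folders on a file share vs. user accounts on the destination service:
matched, unmatched = match_accounts(
    ["jsmith", "MLee", "tchan"],
    ["jsmith@example.com", "mlee@example.com"],
)
print(matched)    # [('jsmith', 'jsmith@example.com'), ('MLee', 'mlee@example.com')]
print(unmatched)  # ['tchan']
```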
Configuring Your Locations - Impersonation
Impersonation allows a site admin access to all the folders on the site, including those that belong to other users. With DataHub, a job can be set up using the username and password of the site admin to sync, migrate, or copy files to or from a different user's account without ever having the username or password of that user.
How and why would I use impersonation?
Use impersonation when the content you need to transfer belongs to other users: a job configured with the site admin's credentials can sync, migrate, or copy files to or from those users' accounts without ever needing their usernames or passwords.
Enable Run as user...
Choose Source User
Job Category
The category function allows for the logical grouping of jobs for reporting and filtering purposes. The category is optional and does not alter the job function in any way.
DataHub comes with two default job categories:
Maintenance: DataHub maintenance jobs only. This category allows you to view reports of background maintenance jobs and is not intended for newly created transfer jobs.
Default: When a category is not defined during job creation, the job is automatically given the Default category. This option allows you to create a report for all jobs to which a custom category was not assigned.
Create Job Category
Enable the feature, then select an existing job category or create a new one
From the jobs grid, filter by category
Job Policies
Define what should happen once items have been successfully transferred and set up rules around how to deal with content as it is updated on your resources while the job is running.
- DataHub works on the concept of “deltas”: the transfer engine only transfers files that are new or have been updated since the last run (see the sketch after this list)
- File version conflicts occur when the same file on the source and destination platforms has been updated in between job executions
- Policies define how DataHub handles file version conflicts and whether or not it propagates a detected file deletion to the other location.
- Each job has its own policies defined and the settings are NOT global across all jobs
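As a rough sketch of these concepts (not DataHub's actual engine), the function below classifies a file by comparing the source and destination modification times recorded at the last run against the current ones. The per-file state store and timestamps are hypothetical.

```python
# Illustrative sketch only: classifying a file as a delta, a conflict, or a
# deletion. The per-file state recorded at the last run is hypothetical.
def classify(path, source_mtime, dest_mtime, last_run_state):
    """mtimes are None when the file no longer exists on that side."""
    src_seen, dst_seen = last_run_state.get(path, (None, None))
    if source_mtime is None or dest_mtime is None:
        return "delete detected"          # handled per the job's Delete Policy
    src_changed = source_mtime != src_seen
    dst_changed = dest_mtime != dst_seen
    if src_changed and dst_changed:
        return "conflict"                 # handled per the job's Conflict Policy
    if src_changed or dst_changed:
        return "transfer delta"
    return "no change"

state = {"report.docx": (100, 100)}       # (source, destination) mtimes at last run
print(classify("report.docx", 120, 100, state))   # transfer delta
print(classify("report.docx", 120, 130, state))   # conflict
print(classify("report.docx", None, 100, state))  # delete detected
```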
Conflict Policy - File Version Conflicts
When a conflict is detected on either the source or the destination, Conflict Policy determines how DataHub will behave
For more information, please see: Conflict Policy
Delete Policy - Deleted Items
When a delete is detected on either the Source or the Destination, Delete Policy determines how DataHub will behave.
For more information, please see: Delete Policy
Behaviors
Determine how this job should execute and what course of action to take in different scenarios. All behaviors are enabled by default as the recommended settings to ensure content is transferred successfully to the destination.
Zip Unsupported Files / Restricted Content
Enabling this behavior allows DataHub to compress any file that is not supported on the destination into a .zip format before being transferred. This will be done instead of flagging the item for manual remediation and halting the transfer of the file.
For example, if you attempt to transfer the file "db123.cmd" from a network file share to SharePoint, DataHub will compress the file to "db123.zip" before transferring it, thus avoiding an error.
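A minimal sketch of the zip-before-transfer idea is shown below, assuming a hypothetical set of extensions blocked by the destination; it is not DataHub code.

```python
# Illustrative sketch only: zip a file whose extension the destination rejects
# instead of flagging it for manual remediation. The blocked set is an assumption.
import os
import zipfile

BLOCKED_ON_DESTINATION = {".cmd", ".exe", ".bat"}   # hypothetical example set

def prepare_for_transfer(path: str) -> str:
    """Return the path that should actually be uploaded to the destination."""
    stem, ext = os.path.splitext(path)
    if ext.lower() not in BLOCKED_ON_DESTINATION:
        return path                                  # supported: send as-is
    zip_path = stem + ".zip"
    with zipfile.ZipFile(zip_path, "w", zipfile.ZIP_DEFLATED) as zf:
        zf.write(path, arcname=os.path.basename(path))
    return zip_path                                  # "db123.cmd" -> "db123.zip"

# Usage:
with open("db123.cmd", "w") as f:
    f.write("echo hello")
print(prepare_for_transfer("db123.cmd"))             # db123.zip
```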
Allow unsupported file names to be changed
The Segment Transformation policy controls whether DataHub can change folder and file names to comply with the destination platform's restrictions.
Enabling this behavior allows DataHub to change the names of folders and files that contain characters that are not supported by the destination before transferring the file. This will be done instead of flagging the file for manual remediation and preventing it from being transferred.
When this occurs, the unsupported character will be transformed into an underscore.
For example, if you attempted to transfer the file "Congrats!.txt" from Box to NFS, it would be transformed to "Congrats_.txt" and appear that way on the destination.
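A minimal sketch of this transformation is below; the set of characters treated as unsupported is an assumption for illustration and will vary by destination platform.

```python
# Illustrative sketch only: replace characters the destination does not accept
# with an underscore. The character class is an assumption, not DataHub's
# actual per-platform rules.
import re

UNSUPPORTED = re.compile(r'[\\/:*?"<>|!#%]')

def transform_segment(name: str) -> str:
    """Rename a file or folder name segment so it complies with the destination."""
    return UNSUPPORTED.sub("_", name)

print(transform_segment("Congrats!.txt"))   # Congrats_.txt
```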
Preserve file versioning between locations
DataHub will preserve and transfer all versions of a file on supported platforms
Define Job Schedule
During job creation, the final step is to define when the job will run and what criteria will define when it stops.
- Save job will launch the job scheduler
- Save job and run it right now will trigger the job to start immediately. It will then run every 15 minutes after the last execution completes (see the scheduling sketch below)
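The sketch below illustrates what "every 15 minutes after the last execution completes" means in practice: the interval is measured from the end of a run, not its start. run_job is a placeholder, not a DataHub function.

```python
# Illustrative sketch only: the next run starts 15 minutes after the previous
# run finishes, regardless of how long that run took.
import time

INTERVAL_SECONDS = 15 * 60

def run_job():
    print("transferring deltas...")     # placeholder for the actual job execution

def schedule_forever():
    while True:
        run_job()                       # the run itself may take any amount of time
        time.sleep(INTERVAL_SECONDS)    # then wait 15 minutes before the next run
```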
Schedule Stop Policies
Stop policies determine when a job should stop running. If none of the stop policies are enabled, a scheduled job will continue to run until it is manually stopped or removed.
The options for the stop policy are:
Stop after a number of total runs
The number of total executions before the job will move to "complete" status
Stop after a number of runs with no changes
The job has run and detected no further changes; all content has transferred successfully.
A run that picks up newly added content on the source does not increment your stop policy count. However, job executions that detect no changes do not need to be consecutive to increment the count.
Stop after a number of failures
Most failures are resolved through automatic retries. If the retries fail to resolve the failures, then manual intervention is required. This policy takes the job out of rotation so that the issue can be investigated.
Job executions that detect failures do not need to be consecutive to increment your stop policy count.
Stop after a specific date
The job will "complete" on the date defined. The sketch below shows how these stop conditions could be evaluated after each run.
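As a rough sketch (not DataHub's implementation), the function below evaluates the four stop conditions against a job's run history; note that the no-change and failure counters accumulate across non-consecutive runs. The thresholds and per-run summary format are assumptions.

```python
# Illustrative sketch only: evaluating the stop policies after each run.
from datetime import date

def should_stop(runs, *, max_runs=None, max_no_change=None,
                max_failures=None, stop_date=None, today=None):
    """runs is a list of per-run summaries, e.g. {'changes': 3, 'failures': 0}."""
    no_change_runs = sum(1 for r in runs if r["changes"] == 0)
    failed_runs = sum(1 for r in runs if r["failures"] > 0)
    return any([
        max_runs is not None and len(runs) >= max_runs,
        max_no_change is not None and no_change_runs >= max_no_change,  # need not be consecutive
        max_failures is not None and failed_runs >= max_failures,       # need not be consecutive
        stop_date is not None and (today or date.today()) >= stop_date,
    ])

history = [{"changes": 5, "failures": 0},
           {"changes": 0, "failures": 0},
           {"changes": 2, "failures": 0},
           {"changes": 0, "failures": 0}]
print(should_stop(history, max_no_change=2))   # True: two no-change runs, not consecutive
```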
Job Summary - Review your job configuration
Before you create your job, review all your configurations and adjust as needed. Modifying your job after creation is not supported; however, the option to duplicate your current job will allow you to make any adjustments without starting from the beginning.
- The Edit option will take you directly to the configuration to make changes