You shouldn't have to babysit all of those file copies; a scheduled task is perfect for automating this job.
Copying files to another folder or server is a trivial task, no matter how you do it. There are a number of ways to get the job done: dragging and dropping the file in Windows Explorer, Copy-Item with PowerShell or the simple copy command in DOS. It's just a matter of specifying a source and a destination path and setting a few other optional parameters. It's only when you start copying a lot of files on a frequent basis that you run into problems.
When automating file copies, especially in a Windows environment, your go-to scripting language is going to be Windows PowerShell. If you need to quickly copy one or more files from one folder to another, PowerShell is a great way to do that. Not only is it easy to kick off PowerShell scripts manually, but you can also run them on a schedule by using Windows scheduled tasks.
In this article, we'll go over how to perform file transfers using PowerShell by writing a script and creating a scheduled task to kick off that script on a recurring basis. But before we start, I'm going to assume that you have at least PowerShell v4 installed on your computer. Otherwise, the tricks I'm about to show you may not work properly.
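You can check which version you're running by inspecting the built-in $PSVersionTable variable:

```powershell
# Display the installed PowerShell version; Major should be 4 or higher
$PSVersionTable.PSVersion
```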
First you need to create a script to perform file transfers. Let's call the script CopyFiles.ps1. This script will contain the following code:
param(
    [string]$SourcePath,
    [string]$DestinationPath
)

Copy-Item -Path $SourcePath -Destination $DestinationPath -Recurse
As you can see, the script is simple, but it leaves room for lots of customization depending on your environment.
The most complicated part of this script is the param() section. This is a parameter block containing two parameters: SourcePath and DestinationPath. By making both of these values parameters, we can pass different values into the script and reuse it. If SourcePath and DestinationPath were hard-coded paths, we'd have to create a separate script for every different file copy!
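As an optional refinement (not part of the original script), you could have PowerShell prompt for any value you forget to pass by marking both parameters mandatory:

```powershell
param(
    # Mandatory attributes make PowerShell prompt for missing values
    [Parameter(Mandatory)]
    [string]$SourcePath,

    [Parameter(Mandatory)]
    [string]$DestinationPath
)

Copy-Item -Path $SourcePath -Destination $DestinationPath -Recurse
```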
Manually kicking off this script will look something like this:
& .\CopyFiles.ps1 -SourcePath C:\Source -DestinationPath \\SERVER\Destination
This example would copy all files and subfolders in the C:\Source folder to the \\SERVER\Destination shared folder.
Now that you have your CopyFiles.ps1 PowerShell script, head over to the computer where you'd like to kick it off. In this example, we're going to create a scheduled task to run this script once a day at 3 a.m.
You could create scheduled tasks by running the Task Scheduler GUI and creating one that way, but we're all about automation here. Let's learn how to create the scheduled task in PowerShell as well. To do this, you'll need to complete four rough steps:
Create the scheduled task action.
Create the trigger.
Create the scheduled task in memory.
Create the scheduled task on the computer.
Here's what that looks like in practice. First, we'll create the scheduled task action. This defines the EXE to run along with any arguments. Here, I'm assuming that your script is located at C:\CopyFiles.ps1.
$Action = New-ScheduledTaskAction -Execute 'C:\Windows\System32\WindowsPowerShell\v1.0\powershell.exe' -Argument '-NonInteractive -NoLogo -NoProfile -File C:\CopyFiles.ps1 -SourcePath C:\Source -DestinationPath \\SERVER\Destination'
Next, we'll create a trigger to kick it off at 3 a.m. every day.
$Trigger = New-ScheduledTaskTrigger -Daily -At '3AM'
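The -Daily switch is just one option; New-ScheduledTaskTrigger supports other schedules as well. As a sketch, a weekly run instead of a daily one might look like this:

```powershell
# Hypothetical alternative: run every Monday at 3 a.m. instead of daily
$WeeklyTrigger = New-ScheduledTaskTrigger -Weekly -DaysOfWeek Monday -At '3AM'
```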
Next, we'll create the scheduled task in memory using the action and trigger that we just created.
$Task = New-ScheduledTask -Action $Action -Trigger $Trigger -Settings (New-ScheduledTaskSettingsSet)
Finally, we'll actually create the scheduled task on the system, calling it File Transfer Automation and running it under the local administrator account with the provided password.
$Task | Register-ScheduledTask -TaskName 'File Transfer Automation' -User 'administrator' -Password 'supersecret'
This registers the task, and it will now copy all files from your source folder to the destination server every day at 3 a.m.
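To confirm everything is wired up, you can query the new task and optionally kick it off once by hand using the ScheduledTasks module:

```powershell
# Verify the task was registered and check its current state
Get-ScheduledTask -TaskName 'File Transfer Automation'

# Optionally trigger a manual test run right away
Start-ScheduledTask -TaskName 'File Transfer Automation'
```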
Adam Bertram is a 20-year veteran of IT. He's currently an automation engineer, blogger, independent consultant, freelance writer, author, and trainer. Adam focuses on DevOps, system management, and automation technologies as well as various cloud platforms. He is a Microsoft Cloud and Datacenter Management MVP and an efficiency nerd who enjoys teaching others a better way to leverage automation.