PowerShell - Improve ForEach with jobs


I have a PowerShell script (I'm running PSVersion 4.0) that cycles through the files in a particular directory and performs a series of replaces on each file's contents. The script is below:

# Set the path to cycle through
$pathToFiles = "e:\zac's docs\files\*"

# Only do the work if the path exists
if (Test-Path -Path $pathToFiles) {
    # Do the replacements
    Get-ChildItem -Path "$pathToFiles" -Include "*.csv" -Exclude "cleaned - *.csv" |
    ForEach-Object {
        # The first replace changes double quotes that are part of the text to an escaped double quote (i.e. "")
        # The second replace changes single double quotes (which are assumed to be text qualifiers) to a custom text qualifier (i.e. |~|)
        # The third replace changes the escaped double quotes (i.e. "") back to a single double quote (i.e. ") and encodes in UTF8
        Get-Content $_.FullName -Encoding UTF8 |
        ForEach-Object {$_ -replace "`"(.*?)`"(?!`,)", "`"`"`$1`"`"" `
                           -replace "(?<!`")`"{1}(?!`")", "|~|" `
                           -replace "(`"`")", "`""} |
        Set-Content -Encoding UTF8 ($_.DirectoryName + '\' + 'cleaned - ' + $_.BaseName + '.csv')

        # Move the original file to the processed folder
        $newLocation = $_.DirectoryName + "\processed\" + $_.BaseName + ".csv"
        Move-Item $_.FullName $newLocation
    }
}

This script runs more slowly than I'd like, so I've been trying to update it to make use of background jobs, as described in several places.

However, so far I have been unsuccessful in improving the script in any way, or even getting it to run to completion. I can see a bunch of jobs starting, but then nothing happens, and I have to kill PowerShell ISE through Task Manager. I'm sure I need to add something to control the number of jobs created too, but I'm not sure how. My best attempt at the new script is below:

$pathToFiles = "e:\zac's docs\npdd\xboxonecsv\ubisoft\*"

$files = Get-ChildItem -Path "$pathToFiles" -Include "*.csv" -Exclude "cleaned - *.csv"

$scriptBlock = {
    param($file)

    # The second replace changes single double quotes (which are assumed to be text qualifiers) to a custom text qualifier (i.e. |~|)
    # The third replace changes the escaped double quotes (i.e. "") back to a single double quote (i.e. ") and encodes in UTF8
    Get-Content $file.FullName -Encoding UTF8 |
    ForEach-Object {$_ -replace "`"(.*?)`"(?!`,)", "`"`"`$1`"`"" `
                       -replace "(?<!`")`"{1}(?!`")", "|~|" `
                       -replace "(`"`")", "`""} |
    Set-Content -Encoding UTF8 ($file.DirectoryName + '\' + 'cleaned - ' + $file.BaseName + '.csv')

    # Move the original file to the processed folder
    $newLocation = $file.DirectoryName + "\processed\" + $file.BaseName + ".csv"
    Move-Item $file.FullName $newLocation
}

$files | ForEach-Object { Start-Job -ScriptBlock $scriptBlock -ArgumentList $_ | Out-Null }
Get-Job | Wait-Job | Receive-Job

Any assistance is welcome. Thanks!

You are apparently already hitting the disk's performance limit when running the process single-threaded, so splitting the work into multiple jobs will only slow the operation down due to concurrent read/write requests for different files on a single hard disk. Stick with the single-threaded approach: that way the reads and writes are sequential, which performs better on this type of underlying disk hardware.
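For completeness, since the question also asked how to limit the number of concurrent jobs: if the files lived on storage that does benefit from parallel I/O (an SSD, for example), the job count could be capped with a simple polling loop. This is a sketch, not part of the original answer; the $maxJobs value and the polling interval are assumptions to tune for your hardware, and $files and $scriptBlock are the variables from the script above.

```powershell
$maxJobs = 4  # assumed concurrency limit; tune for your machine

foreach ($file in $files) {
    # Block until a slot frees up, so no more than $maxJobs jobs run at once
    while ((Get-Job -State Running).Count -ge $maxJobs) {
        Start-Sleep -Milliseconds 250
    }
    Start-Job -ScriptBlock $scriptBlock -ArgumentList $file | Out-Null
}

# Drain the remaining jobs, collect their output, then clean up
Get-Job | Wait-Job | Receive-Job
Get-Job | Remove-Job
```

Note that Remove-Job matters in long ISE sessions: completed jobs otherwise accumulate in the session and each one holds its own runspace and buffered output.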

