Administration Guides


Eyeglass Search & Recover - Scripted Archive Solution with Dell EMC ECS (s3cmd and SMB)


Overview

This solution enables simple, scripted upload of data into ECS, using Search's powerful searching and script-creation features to simplify bulk copy or move operations into object storage. The tested solution reads the files over SMB and sends them over the S3 protocol to an ECS storage bucket for archive.

Solution Test Environment

  • Eyeglass Search & Recover

  • Dell EMC PowerScale (Source)

  • Dell EMC ECS v3.3 (Archive)

  • s3cmd (Windows platform)

ECS Configuration

  1. Create a new test user: test1

(Manage → Users → New Object Users)

  2. Next to Add Password, select Generate & Add Secret Key (S3)

Click Show Secret Key to view the key

  3. Create a bucket

Manage → Buckets → New Bucket

Set the Bucket Name, e.g. Bucket2

Set the owner to the test1 user

s3cmd Installation

  1. On a Windows machine, install Python (version 2.7 or higher)
  2. Set the environment variable for the path to this Python installation directory (e.g. c:\Python27)
  3. Download s3cmd from http://s3tools.org/download
  4. Extract it to a directory (e.g. c:\s3cmd-2.0.2) and change into that directory: cd c:\s3cmd-2.0.2
  5. Configure s3cmd with this command:

     python s3cmd --configure

     (It may warn about a missing python-dateutil module. To install it, change to the Python scripts directory, e.g. cd c:\Python27\Scripts\, run pip install python-dateutil, and then re-run the configuration command from the c:\s3cmd-2.0.2 directory.)
  6. Specify the following settings:

     • Access Key: test1 (the S3 user created earlier)
     • Secret Key: a4TMiLkD2S1CRu/vhmVnMrZv87vlOlakYWdq1Do7 (the secret key for that S3 user)
     • Default Region: US (accept the default)
     • S3 Endpoint: ecs25.ad1.test:9020 (the DNS name / IP address of the ECS node)
     • DNS-style bucket+hostname:port template for accessing a bucket: ecs25.ad1.test:9020
  7. Save the settings. Example session:

     c:\s3cmd-2.0.2>python s3cmd --configure
     ERROR: Option --preserve is not yet supported on MS Windows platform. Assuming --no-preserve.
     ERROR: Option --progress is not yet supported on MS Windows platform. Assuming --no-progress.

     Enter new values or accept defaults in brackets with Enter.
     Refer to user manual for detailed description of all options.

     Access key and Secret key are your identifiers for Amazon S3. Leave them empty for using the env variables.
     Access Key: test1
     Secret Key: a4TMiLkD2S1CRu/vhmVnMrZv87vlOlakYWdq1Do7
     Default Region [US]:

     Use "s3.amazonaws.com" for S3 Endpoint and not modify it to the target Amazon S3.
     S3 Endpoint [s3.amazonaws.com]: ecs25.ad1.test:9020

     Use "%(bucket)s.s3.amazonaws.com" to the target Amazon S3. "%(bucket)s" and "%(location)s" vars can be used
     if the target S3 system supports dns based buckets.
     DNS-style bucket+hostname:port template for accessing a bucket [%(bucket)s.s3.amazonaws.com]: ecs25.ad1.test:9020

     Encryption password is used to protect your files from reading
     by unauthorized persons while in transfer to S3
     Encryption password:
     Path to GPG program:

     When using secure HTTPS protocol all communication with Amazon S3
     servers is protected from 3rd party eavesdropping. This method is
     slower than plain HTTP, and can only be proxied with Python 2.7 or newer
     Use HTTPS protocol [Yes]: No

     On some networks all internet access must go through a HTTP proxy.
     Try setting it here if you can't connect to S3 directly
     HTTP Proxy server name:

     New settings:
       Access Key: test1
       Secret Key: a4TMiLkD2S1CRu/vhmVnMrZv87vlOlakYWdq1Do7
       Default Region: US
       S3 Endpoint: ecs25.ad1.test:9020
       DNS-style bucket+hostname:port template for accessing a bucket: ecs25.ad1.test:9020
       Encryption password:
       Path to GPG program: None
       Use HTTPS protocol: False
       HTTP Proxy server name:
       HTTP Proxy server port: 0

     Test access with supplied credentials? [Y/n] Y
     Please wait, attempting to list all buckets...
     Success. Your access key and secret key worked fine :-)

     Now verifying that encryption works...
     Not configured. Never mind.

     Save settings? [y/N] y
     Configuration saved to 'C:\Users\administrator\AppData\Roaming\s3cmd.ini'
  8. To verify s3cmd, use this command to list the buckets that belong to the user:

     c:\s3cmd-2.0.2>python c:\s3cmd-2.0.2\s3cmd ls
     2019-06-14 09:33  s3://bucket2
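As an alternative to answering the interactive prompts, the same settings can be written straight into the s3cmd configuration file. A minimal sketch in Python using this guide's example values (access_key, secret_key, host_base, host_bucket, and use_https are standard s3cmd INI options; here the file is written to the current directory for illustration, whereas s3cmd on Windows looks for it at the path reported above, e.g. C:\Users\administrator\AppData\Roaming\s3cmd.ini):

```python
import configparser

# Example values from this guide; substitute your own ECS user, key, and endpoint.
settings = {
    "access_key": "test1",
    "secret_key": "a4TMiLkD2S1CRu/vhmVnMrZv87vlOlakYWdq1Do7",
    "host_base": "ecs25.ad1.test:9020",    # S3 Endpoint
    "host_bucket": "ecs25.ad1.test:9020",  # DNS-style bucket+hostname:port template
    "use_https": "False",                  # this guide tested plain HTTP
}

config = configparser.ConfigParser()
config["default"] = settings

# Write the s3cmd-style INI file.
with open("s3cmd.ini", "w") as f:
    config.write(f)
```

After writing the file, the same `python s3cmd ls` check as above confirms the credentials work against the ECS endpoint.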

Create PowerScale SMB Share

  1. Create a PowerScale SMB share for the folder whose contents will be archived to ECS.

  2. Example of the path: /ifs/data/search2/

  3. Assign permissions to the user that will run the s3cmd command: read permission is sufficient for upload; deleting the files after the copy to the cloud requires read and write permission.
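Because a permission problem only surfaces partway through a long batch run, it can be worth confirming access up front. A small sketch (the UNC path is this guide's example share; os.access is only a coarse check and may not reflect every SMB ACL nuance):

```python
import os

def check_archive_access(path, will_delete=False):
    """Read access is needed to upload; deleting files after the
    copy additionally needs write access on the share."""
    if not os.access(path, os.R_OK):
        return False
    return os.access(path, os.W_OK) if will_delete else True

# Example share path from this guide; prints False if the path
# is unreachable or lacks the required permissions.
print(check_archive_access(r"\\rnsm04-c08.ad1.test\searchfolder2", will_delete=True))
```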

Create Scripts from Eyeglass Search & Recover for Push to ECS and Delete From PowerScale

  1. The search used to locate files can be any search, including content-aware searches, age-based searches using last-accessed or last-modified date stamps on files, or any combination.
    1. Login to Eyeglass Search & Recover
    2. Search for the folder whose contents will be archived. Example: folder1
    3. Click the CMD Writer icon
    4. Define the script content as follows:

      Script Format: cmd

      Command (inserted before each file location): python c:\s3cmd-2.0.2\s3cmd put

      File Location: Plain

      Destination (appended after each file location): s3://<bucket-name> (in this example, s3://bucket2)

      Surround file location with quotes: Checked

    5. Click the “Create For All” button
    6. Open the script in a text editor and save it as a batch file. Example:

      # -Content: folder1
      python c:\s3cmd-2.0.2\s3cmd put "\\rnsm04-c08.ad1.test\searchfolder2\folder1\10.txt" s3://bucket2
      python c:\s3cmd-2.0.2\s3cmd put "\\rnsm04-c08.ad1.test\searchfolder2\folder1\1.txt" s3://bucket2
      python c:\s3cmd-2.0.2\s3cmd put "\\rnsm04-c08.ad1.test\searchfolder2\folder1\2.txt" s3://bucket2
      python c:\s3cmd-2.0.2\s3cmd put "\\rnsm04-c08.ad1.test\searchfolder2\folder1\3.txt" s3://bucket2
      python c:\s3cmd-2.0.2\s3cmd put "\\rnsm04-c08.ad1.test\searchfolder2\folder1\9.txt" s3://bucket2
      python c:\s3cmd-2.0.2\s3cmd put "\\rnsm04-c08.ad1.test\searchfolder2\folder1\5.txt" s3://bucket2
      python c:\s3cmd-2.0.2\s3cmd put "\\rnsm04-c08.ad1.test\searchfolder2\folder1\8.txt" s3://bucket2
      python c:\s3cmd-2.0.2\s3cmd put "\\rnsm04-c08.ad1.test\searchfolder2\folder1\4.txt" s3://bucket2
      python c:\s3cmd-2.0.2\s3cmd put "\\rnsm04-c08.ad1.test\searchfolder2\folder1\6.txt" s3://bucket2
      python c:\s3cmd-2.0.2\s3cmd put "\\rnsm04-c08.ad1.test\searchfolder2\folder1\7.txt" s3://bucket2

      Example of the output:

      c:\s3cmd-2.0.2>python c:\s3cmd-2.0.2\s3cmd put "\\rnsm04-c08.ad1.test\searchfolder2\folder1\3.txt" s3://bucket2
      WARNING: Module python-magic is not available. Guessing MIME types based on file extensions.
      upload: '\\rnsm04-c08.ad1.test\searchfolder2\folder1\3.txt' -> 's3://bucket2/3.txt' (15 bytes in 0.6 seconds, 23.40 B/s) [1 of 1]
      c:\s3cmd-2.0.2>python c:\s3cmd-2.0.2\s3cmd put "\\rnsm04-c08.ad1.test\searchfolder2\folder1\4.txt" s3://bucket2
      WARNING: Module python-magic is not available. Guessing MIME types based on file extensions.
      upload: '\\rnsm04-c08.ad1.test\searchfolder2\folder1\4.txt' -> 's3://bucket2/4.txt' (15 bytes in 1.2 seconds, 12.47 B/s) [1 of 1]

    7. After the file copy has completed, verify the contents of the bucket with this command:

      python c:\s3cmd-2.0.2\s3cmd ls s3://bucket2

      Example of the output:

      c:\s3cmd-2.0.2>python c:\s3cmd-2.0.2\s3cmd ls s3://bucket2
      2019-06-27 06:34        15   s3://bucket2/1.txt
      2019-06-27 07:34        15   s3://bucket2/2.txt
      2019-06-27 07:54        15   s3://bucket2/3.txt
      2019-06-27 07:54        15   s3://bucket2/4.txt

    8. Verify that the files have been copied to the ECS bucket by comparing against the source directory listing

      (Example: dir \\rnsm04-c08.ad1.test\searchfolder2\folder1)

      1. Create another script to remove the files from the PowerScale SMB share
      2. Repeat step 3 above (click the CMD Writer icon), and define the command as follows:

        Script Format: cmd

        Command (inserted before each file location): del

        File Location: Plain

        Destination (appended after each file location): leave empty

        Surround file location with quotes: Checked

      3. Click the “Create For All” button
      4. Open the script in a text editor and save it as a batch file. Example:

        # -Content: folder1
        del "\\rnsm04-c08.ad1.test\searchfolder2\folder1\10.txt"
        del "\\rnsm04-c08.ad1.test\searchfolder2\folder1\1.txt"
        del "\\rnsm04-c08.ad1.test\searchfolder2\folder1\2.txt"
        del "\\rnsm04-c08.ad1.test\searchfolder2\folder1\9.txt"
        del "\\rnsm04-c08.ad1.test\searchfolder2\folder1\3.txt"
        del "\\rnsm04-c08.ad1.test\searchfolder2\folder1\5.txt"

      5. Run the batch file to remove the files from PowerScale
      6. Verify that the files are no longer on PowerScale
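The overall flow that CMD Writer automates — one put command per file, then a matching del command once the copy has been verified — can be sketched as a plain Python generator. This is only an illustration of what the two batch files contain (build_archive_scripts is a hypothetical helper; the s3cmd path and bucket are this guide's example values):

```python
def build_archive_scripts(file_paths, bucket, s3cmd=r"c:\s3cmd-2.0.2\s3cmd"):
    """Return (upload_lines, delete_lines) for the given UNC paths,
    quoting each file location as the 'Surround file location with
    quotes' option does."""
    uploads = [f'python {s3cmd} put "{p}" s3://{bucket}' for p in file_paths]
    deletes = [f'del "{p}"' for p in file_paths]
    return uploads, deletes

files = [
    r"\\rnsm04-c08.ad1.test\searchfolder2\folder1\1.txt",
    r"\\rnsm04-c08.ad1.test\searchfolder2\folder1\2.txt",
]
uploads, deletes = build_archive_scripts(files, "bucket2")
print("\n".join(uploads))   # body of the upload batch file
print("\n".join(deletes))   # body of the cleanup batch file
```

Running the generated upload script, verifying the bucket listing, and only then running the delete script preserves the copy-then-delete order described above.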
© Superna LLC