salt salt* state.sls os_modifications.repo_update --output-diff
pip3 install gitpython
salt salt-master* cmd.run 'yum clean all ; yum makecache fast'
salt salt* cmd.run 'yum check-update'
salt salt* pkg.upgrade name=salt-master
salt salt* state.sls salt_master.salt_posix_acl --output-diff
salt salt* cmd.run 'systemctl restart salt-master'
salt salt*com state.sls salt_master.salt_master_configs test=true
salt salt* state.sls os_modifications.repo_update --output-diff
salt salt* cmd.run 'yum clean all ; yum makecache fast'
salt salt* cmd.run 'yum check-update'
salt salt* pkg.upgrade name=salt-minion
watch 'salt salt* test.ping'
salt '<target>' cmd.run 'pip3 install boto'
salt '<target>' cmd.run 'pip3 install boto3'
salt '<target>' cmd.run 'pip3 install pyinotify'
salt '<target>' saltutil.sync_all
salt '<target>' saltutil.refresh_modules
salt '<target>' grains.get ec2:placement:availability_zone
salt '<target>' grains.get environment
salt '<target>' service.restart salt-minion
salt '<target>' cmd.run 'tail /var/log/salt/minion'
salt sensu* pkg.upgrade name=salt-minion
salt vault*local pkg.upgrade name=salt-minion
salt moose*local pkg.upgrade name=salt-minion
salt -C '* and not ( moose* or afs* or nga* or ma-* or mo-* or la-* or dc-* or vault* or sensu* or interconnect* or resolver* or salt-master* )' pkg.upgrade name=salt-minion
salt -C 'resol* or interc*' pkg.upgrade name=salt-minion
salt-call -ldebug --local grains.get ec2_info
salt-call -ldebug --local grains.get ec2_tags
boto and boto3 need to be installed for py3 for the ec2 grains:
pip3 install boto
pip3 install boto3
pip3 list | grep boto
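A quick way to confirm both modules actually resolve under python3 before syncing grains (a sketch; `find_spec` reports availability without importing, so it won't trip on a broken install the way a bare import would):

```shell
# Report whether boto and boto3 are resolvable under python3
# without importing them (find_spec returns None when missing).
python3 - <<'EOF'
import importlib.util as u
for mod in ("boto", "boto3"):
    print(mod, "ok" if u.find_spec(mod) else "MISSING")
EOF
```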
Push out the new grain that was updated for py3; this fixes the ec2:placement:availability_zone grain:
salt '*local' saltutil.sync_all
salt '*com' saltutil.sync_all
salt '*local' grains.get ec2:placement:availability_zone
salt '*com' grains.get ec2:placement:availability_zone
ISSUE: [ERROR ] Returner splunk.returner could not be loaded: 'splunk.returner' is not available.
SOLUTION: manually restart the minion
ISSUE: 2020-11-23 18:13:09,719 [salt.beacons :144 ][WARNING ][15141] Unable to process beacon inotify
Check the beacon config: cmd.run 'ls -larth /etc/salt/minion.d/beacons.conf'
ISSUE: requests.packages.urllib3.exceptions.MaxRetryError: HTTPSConnectionPool(host='iratemoses.mdr.defpoint.com', port=8088): Max retries exceeded with url: /services/collector/event (Caused by NewConnectionError(': Failed to establish a new connection: [Errno -2] Name or service not known',))
SOLUTION: IGNORE: this was happening with previous version of salt and python2.
ISSUE on reposerver: 2020-11-23 19:42:20,061 [salt.state :328 ][ERROR ][18267] Cron /usr/local/bin/repomirror-cron.sh for user root failed to commit with error
"/tmp/__salt.tmp.9b64eos8":1: bad minute
errors in crontab file, can't install.
SOLUTION: bad cron file?
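"bad minute" means the first field of the crontab entry Salt generated isn't valid cron syntax. A rough pre-check before blaming crontab(1) (a sketch: the sample schedule is made up, and the regex only checks the shape of the field, not the 0-59 range):

```shell
# Validate the minute field of a cron line before installing it.
# Accepts *, numbers, ranges, steps, and comma lists; does not bound values.
line='0 2 * * * /usr/local/bin/repomirror-cron.sh'
minute=${line%% *}
pattern='^(\*|[0-9]+(-[0-9]+)?)(/[0-9]+)?(,(\*|[0-9]+(-[0-9]+)?)(/[0-9]+)?)*$'
if echo "$minute" | grep -Eq "$pattern"; then
  echo "minute ok: $minute"
else
  echo "bad minute: $minute"
fi
```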
ISSUE: [CRITICAL][1745] Pillar render error: Rendering SLS 'mailrelay' failed
2020-11-23 19:26:11,255 [salt.pillar :889 ][CRITICAL][1745] Rendering SLS 'mailrelay' failed, render error: Jinja variable 'salt.utils.context.NamespacedDictWrapper object' has no attribute 'ec2'
Traceback (most recent call last):
  File "/usr/lib/python3.6/site-packages/salt/utils/templates.py", line 400, in render_jinja_tmpl
    output = template.render(**decoded_context)
  File "/usr/lib/python3.6/site-packages/jinja2/environment.py", line 989, in render
    return self.environment.handle_exception(exc_info, True)
  File "/usr/lib/python3.6/site-packages/jinja2/environment.py", line 754, in handle_exception
    reraise(exc_type, exc_value, tb)
  File "/usr/lib/python3.6/site-packages/jinja2/_compat.py", line 37, in reraise
    raise value.with_traceback(tb)
  File "", line 1, in top-level template code
  File "/usr/lib/python3.6/site-packages/jinja2/environment.py", line 389, in getitem
    return obj[argument]
jinja2.exceptions.UndefinedError: 'salt.utils.context.NamespacedDictWrapper object' has no attribute 'ec2'
SOLUTION: ?
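One possible fix (a sketch only, not verified against the actual mailrelay pillar): the render dies because the minion has no `ec2` grain, so guard the lookup in the template with a default instead of attribute access:

```jinja
{# Sketch: defensive grain lookup in the pillar template (actual sls contents unknown). #}
{% set az = salt['grains.get']('ec2:placement:availability_zone', 'unknown') %}
```

`grains.get` with a default renders cleanly even when the grain is missing, whereas `grains.ec2.placement...` raises the UndefinedError above.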
https://jira.xdr.accenturefederalcyber.com/browse/MSOCI-1164
Done when:
All salt minions are running the same version (2018)
All server minions are pegged to a specific version (that can be changed at upgrade time)
Remove yum locks for minion
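The first criterion can be checked mechanically. A sketch that flags stragglers, using sample data in the format of `salt '*' test.version --out=txt` (the hostnames and versions below are made up):

```shell
# Flag minions whose reported version differs from the expected one.
expected="2018.3.4"
cat > /tmp/minion_versions.txt <<'EOF'
salt-master.local: 2018.3.4
sensu1.local: 2018.3.4
moose1.local: 2019.2.0
EOF
awk -F': ' -v v="$expected" '$2 != v {print $1 " is on " $2}' /tmp/minion_versions.txt
```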
Notes:
Packer installs the 2019 repo (packer/scripts/add-saltstack-repo.sh & packer/scripts/provision-salt-minion.sh), then os_modifications (os_modifications.repo_update) overwrites the repo with 2018. This leaves the salt minion stuck at the 2019 version without being able to upgrade.
#salt master (two salt repo files)
/etc/yum.repos.d/salt.repo (salt/fileroots/os_modifications/minion_upgrade.sls)
[salt-2018.3]
name=SaltStack 2018.3 Release Channel for Python 2 RHEL/Centos $releasever
baseurl=https://repo.saltstack.com/yum/redhat/7/$basearch/2018.3
failovermethod=priority
enabled=1

/etc/yum.repos.d/salt-2018.3.repo
[salt-2018.3]
name=SaltStack 2018.3 Release Channel for Python 2 RHEL/Centos $releasever
baseurl=https://repo.saltstack.com/yum/redhat/7/$basearch/2018.3
failovermethod=priority
enabled=1
gpgcheck=1
gpgkey=file:///etc/pki/rpm-gpg/saltstack-signing-key
       file:///etc/pki/rpm-gpg/centos7-signing-key
#reposerver.msoc.defpoint.local /etc/yum.repos.d/salt.repo
[salt-2018.3]
name=SaltStack 2018.3 Release Channel for Python 2 RHEL/Centos $releasever
baseurl=https://repo.saltstack.com/yum/redhat/7/$basearch/2018.3
failovermethod=priority
enabled=1
gpgcheck=0

Two repo files in salt, both are 2018.3; one has proxy=none, the other doesn't. The salt_rhel.repo is just for RHEL and the other is for CentOS.
salt/fileroots/os_modifications/files/salt.repo (salt/fileroots/os_modifications/repo_update.sls uses this file and it is actively pushed to CENTOS minions)
salt/fileroots/os_modifications/files/salt_rhel.repo (salt/fileroots/os_modifications/repo_update.sls uses this file and it is actively pushed to RHEL minions)
/etc/yum.repos.d/salt-2018.3.repo (not sure how this file is being pushed; possibly from Chris fixing stuff)
STEPS
PROBLEMS
bastion.msoc.defpoint.local: error: unpacking of archive failed on file /var/log/salt: cpio: lsetfilecon
mailrelay.msoc.defpoint.local: pillar broken
PROD
PROBLEMS
The pillar depends on a custom grain, and the custom grain depends on specific python modules. The moose servers seem to have python module issues; these commands helped fix them (python yum vs. pip):
ERROR: Could not get AWS connection: global name 'boto3' is not defined
ERROR: ImportError: cannot import name certs
pip list | grep requests
yum list installed | grep requests
sudo pip uninstall requests
sudo pip uninstall urllib3
sudo yum install python-urllib3
sudo yum install python-requests
pip install boto3 (this installs urllib3 via pip as a dependency!)
pip install boto
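The ImportError usually comes from requests/urllib3 being installed both as RPMs and via pip, shadowing each other. A quick way to spot the overlap (a sketch; output varies per host, and the fallbacks keep it from erroring where pip or rpm is absent):

```shell
# Show each package's presence in pip and in RPM to spot pip/yum overlap.
for m in requests urllib3; do
  echo "== $m =="
  pip list 2>/dev/null | grep -i "^$m " || echo "(not installed via pip)"
  rpm -qa 2>/dev/null | grep -i "python.*$m" || echo "(no matching rpm)"
done
```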
slsutil.renderer salt://os_modifications/repo_update.sls
If the grain is wrong on the salt master but correct with salt-call, restart the minion.
salt moose* grains.item environment
salt moose* cmd.run 'salt-call grains.get environment'
salt moose* cmd.run 'salt-call -ldebug --local grains.get environment'
salt moose* cmd.run 'salt-call -lerror --local grains.get environment'
Boto3 issue is actually a urllib3 issue?
pip -V
pip list | grep boto
pip list | grep urllib3
salt-call is different, connecting to python2.
/bin/bash: pip: command not found
salt 'mooseindexer' cmd.run "salt-call cmd.run 'pip install boto3'"
Resolution steps: Duane will remove /usr/local/bin/pip, which is pointing to python3. pip should be at /usr/bin/pip.
yum --enablerepo=epel -y reinstall python2-pip
To Fix, upgrade the urllib3 module:
salt '*.local' cmd.run 'pip install --upgrade urllib3'
Permissions issue? Run this command as root:
salt salt* state.sls salt_master.salt_posix_acl