================================ INTRODUCTION ==================================
Security Response Tool:
- Collection of utilities to manage, update, and secure SRT database

common/srtool_utils.py  - schedules updates and manages database backups and restores
nist/srtool_nist.py     - gets CVEs from NIST
mitre/srtool_mitre.py   - gets CVEs from Mitre
debian/srtool_debian.py - gets CVEs from Debian
redhat/srtool_redhat.py - gets CVEs from Red Hat
common/get_updates.py   - cron job script that checks each datasource for updates according to their frequencies

=================================== LOGGING =====================================
All logs are stored in the top-level "update_logs" directory

  master_log:
      - tracks every command invocation and its completion status (including error messages)
      - never overwritten

  routine_update_log_(weeknum)_(daynum):
      - tracks the cron job's checks and updates
      - saved in a weekly wheel

  update_nist_log_(weeknum)_(daynum):
      - tracks each NIST update
      - records the ID and status of each individual CVE examined
      - saved in a weekly wheel
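
  The "weekly wheel" naming encodes the week number and weekday into the file name,
  so each of these logs is overwritten once a week.  A minimal sketch of how such a
  name could be derived (the actual helpers in the scripts may differ):

      from datetime import date

      today = date.today()
      weeknum = today.isocalendar()[1]    # ISO week number (1-53)
      daynum = today.isoweekday()         # 1 = Monday ... 7 = Sunday
      log_name = "update_logs/update_nist_log_%s_%s" % (weeknum, daynum)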

==================================== FILES =====================================

* common/srtool_utils.py:

  Backups:  - called with '-B' flag   ...   example: "./bin/srtool_utils.py -B"
            - converts a subset of the database to JSON (e.g. CVEs already exist on NIST's server, so there is no need to back them up fully)
            - To add a whole table to the backup:
                   1. Modify the WHOLE_TABLES list to contain the name of the table
            - To back up only a subset of a table's columns (e.g. save just the name of each CVE record instead of everything):
                   1. Modify SUBSET_TABLES to contain a (table_name, effective_primary_key) tuple, where effective_primary_key can be used to uniquely identify each record
                   2. Create a helper method that returns a list of JSON dictionaries, where each dictionary is one record (see encode_cve_to_json() and encode_user_to_json() for examples, and the sketch below)
                       - dict_factory() and setting conn.row_factory = dict_factory are VERY helpful (essential?)
                   3. Call the helper method from inside backup_db_json()
                       - be sure to save the list it returns to db['your_table_name']
            - Backups are stored in the 'backups' directory and labeled by both weeknum and day of week
                   -  Saved in a weekly wheel

            Methods: backup_db_json(), dict_factory(), encode_cve_to_json(), encode_user_to_json()
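
            A minimal sketch of what a subset-table helper can look like; the table
            'orm_mytable' and its columns are hypothetical, and the real encoders
            (encode_cve_to_json(), encode_user_to_json()) may differ in detail:

                import sqlite3

                def dict_factory(cursor, row):
                    # map each row to a {column_name: value} dictionary
                    return {col[0]: value for col, value in zip(cursor.description, row)}

                def encode_mytable_to_json(conn):
                    # keep only the effective primary key plus the columns worth saving;
                    # everything else can be re-fetched from the upstream data source
                    conn.row_factory = dict_factory
                    cur = conn.cursor()
                    return cur.execute("SELECT name, status FROM orm_mytable").fetchall()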

  Restore:  - called with '-R (weeknum) (weekday)'    ...   example: "./bin/srtool_utils.py -R 25 4"    --->    "restore from ./backups/backup_25_4"
            - restores the database from the JSON backup file located in the 'backups' directory
            - ASSUMES the database exists and is already populated from the data sources (the latter is important for subset tables)
            - When restoring the database, you must start from scratch or SQLite will throw a "unique id constraint" error
                - Can use simulate_corruption() to enter scratch state -- USE WITH CAUTION
            Methods: restore_db_json()
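
            A minimal sketch of one per-table restore step, assuming hypothetical
            table and column names; restore_db_json() itself may differ in detail:

                import json
                import sqlite3

                def restore_table_from_json(conn, backup_path, table, columns):
                    # read the weekly-wheel backup and re-insert the saved records;
                    # assumes the schema already exists (see the note above)
                    with open(backup_path) as f:
                        db = json.load(f)
                    sql = "INSERT INTO %s (%s) VALUES (%s)" % (
                        table, ",".join(columns), ",".join("?" for _ in columns))
                    for record in db[table]:
                        conn.execute(sql, [record[c] for c in columns])
                    conn.commit()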

  Updates:  - called with '-U' flag   ...   example: "./bin/srtool_utils.py -U"
            - goes through each DATASOURCE record in the database and executes its 'command' field (see the sketch below)
            - useful for checking for updates ahead of schedule

            Methods: run_all_updates()
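
            A minimal sketch of the idea, assuming a hypothetical 'orm_datasource'
            table with 'description' and 'command' columns and a database file named
            'srt.sqlite':

                import os
                import sqlite3

                def run_all_updates(db_path="srt.sqlite"):
                    # run every datasource's update command right away,
                    # regardless of its configured schedule
                    conn = sqlite3.connect(db_path)
                    for name, command in conn.execute(
                            "SELECT description, command FROM orm_datasource"):
                        print("Updating %s ..." % name)
                        os.system(command)
                    conn.close()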

  Schedule: - called with '-T (datasource_description) (frequency) (time)'    ...   example: "./bin/srtool_utils.py -T 'NIST JSON 2018' 2 '01:30:25'"   --->    "update 'NIST JSON 2018' data once a day at 25 seconds into 1:30 AM"
            - configures the specified datasource's update schedule
            - time must be in '%H:%M:%S' format
            - frequency is [0, 1, 2, 3, 4, 5] for updating once each minute, hour, day, week, month, and year respectively

            Methods: configure_ds_update()
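
            A minimal sketch of the schedule update, with hypothetical table and
            column names ('orm_datasource', 'update_frequency', 'update_time'):

                import sqlite3
                from datetime import datetime

                def configure_ds_update(description, frequency, time_str,
                                        db_path="srt.sqlite"):
                    # validate the '%H:%M:%S' time string before persisting the schedule
                    datetime.strptime(time_str, "%H:%M:%S")
                    conn = sqlite3.connect(db_path)
                    conn.execute(
                        "UPDATE orm_datasource SET update_frequency = ?, update_time = ?"
                        " WHERE description = ?",
                        (frequency, time_str, description))
                    conn.commit()
                    conn.close()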

* nist/srtool_nist.py:

  Update:       - called with '-u (datasource_description)'   ...   example: "./bin/srtool_nist.py -u 'NIST JSON 2018'"    --->    "update the 'NIST JSON 2018' data right now"
                - checks the specified NIST datasource for updates
                - the '-U' flag instead pulls the incremental-update NIST feed

                Methods: update_nist(), nist_jason(), sql_cve_query(), sql_reference_query(), sql_cwe_query(), sql_cve2cwe_query()
                Classes: Cve(), Reference()
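
                One way such an update check can work: NIST publishes a small '.meta'
                file alongside each yearly JSON feed whose 'lastModifiedDate' line shows
                whether the feed changed since the last pull.  A minimal sketch (the URL
                is only an example; update_nist() may work differently):

                    import urllib.request

                    META_URL = "https://nvd.nist.gov/feeds/json/cve/1.0/nvdcve-1.0-2018.meta"

                    def nist_feed_modified(meta_url=META_URL):
                        # return the feed's lastModifiedDate string, or None if not found
                        with urllib.request.urlopen(meta_url) as f:
                            for line in f.read().decode("utf-8").splitlines():
                                if line.startswith("lastModifiedDate:"):
                                    return line.split(":", 1)[1].strip()
                        return None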

* common/get_updates.py:

  - This is the cron job script that checks every datasource for updates according to their frequencies
  - Calls the above functions and scripts by using the 'command' field of each datasource record
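
  A minimal sketch of the cron pass, assuming a hypothetical 'orm_datasource' table
  with 'command', 'update_frequency', and 'lastModifiedDate' columns (the real script
  may track its update state differently):

      import os
      import sqlite3
      from datetime import datetime, timedelta

      # rough mapping of the frequency codes described in the Schedule section above
      INTERVALS = {0: timedelta(minutes=1), 1: timedelta(hours=1), 2: timedelta(days=1),
                   3: timedelta(weeks=1), 4: timedelta(days=30), 5: timedelta(days=365)}

      def check_datasources(db_path="srt.sqlite"):
          # run a datasource's 'command' only when its update interval has elapsed
          conn = sqlite3.connect(db_path)
          rows = conn.execute("SELECT id, command, update_frequency, lastModifiedDate"
                              " FROM orm_datasource").fetchall()
          now = datetime.now()
          for ds_id, command, frequency, last in rows:
              last_run = datetime.strptime(last, "%Y-%m-%d %H:%M:%S") if last else datetime.min
              if now - last_run >= INTERVALS.get(frequency, timedelta(days=1)):
                  os.system(command)
                  conn.execute("UPDATE orm_datasource SET lastModifiedDate = ? WHERE id = ?",
                               (now.strftime("%Y-%m-%d %H:%M:%S"), ds_id))
          conn.commit()
          conn.close()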