Compare commits


No commits in common. "master" and "feature/logfile-purge-support" have entirely different histories.

2 changed files with 8 additions and 57 deletions


````diff
@@ -12,22 +12,20 @@
 ## 🚀 Overview
-This script fetches enforcement data from an external API, truncates a specified feature layer in ArcGIS, and adds the fetched data as features to the layer. It also logs the operation, saves data to JSON files, and optionally purges old files. Additionally, it supports reloading data from a JSON file without making API calls.
+This script fetches enforcement data from an external API, truncates a specified feature layer in ArcGIS, and adds the fetched data as features to the layer. It also logs the operation, saves data to JSON files, and optionally purges old files.
 ---
 ## 📦 Requirements
-- Python 3.6 or higher (if using the Python script)
+- Python 3.6 or higher
 - Required packages in `requirements.txt`
 - `.env` file with your configuration
 - ArcGIS Online credentials
 ---
-## ⚙️ Installation
-### Python Script
+## 🔧 Installation
 ```bash
 pip install -r requirements.txt
@@ -39,10 +37,6 @@ Or install packages individually:
 pip install requests python-dotenv
 ```
-### Windows Executable
-A Windows executable is available for users who prefer not to run the script directly. You can download it from the [releases page](https://git.nickhepler.cloud/nick/753-Data-Sync/releases). This executable is compiled using PyInstaller and can be run without needing to install Python or any dependencies.
 ---
 ## ⚙️ Configuration
@@ -84,27 +78,15 @@ PURGE_DAYS=5
 ## 🧪 Script Usage
-### Python Script
 ```bash
 python 753DataSync.py --results_per_page 100
 ```
-### Windows Executable
-Simply double-click the executable file to run it. You can also run it from the command line with:
-```bash
-753DataSync.exe --results_per_page 100
-```
 ### CLI Arguments
 | Argument | Description |
 |----------------------|---------------------------------------------|
 | `--results_per_page` | Optional. Number of results per API call (default: `100`) |
-| `--test` | Optional. If set, only fetch the first page of results. |
-| `--reload` | Optional. Load data from a specified JSON file instead of fetching from the API. |
 ---
@@ -117,8 +99,6 @@ Simply double-click the executable file to run it. You can also run it from the
 5. **📤 Add Features** — Sends data to ArcGIS feature layer.
 6. **🧹 File Cleanup** — Deletes `.json`/`.log` files older than `PURGE_DAYS`.
 7. **📑 Dynamic Logs** — Logs saved to `753DataSync_YYYY-MM-DD.log`.
-8. **🧪 Test Mode** — Use the `--test` flag to fetch only the first page of results for testing purposes.
-9. **🔄 Reload Data** — Use the `--reload` flag to truncate the feature layer and load data from a specified JSON file.
 ---
@@ -132,6 +112,7 @@ Simply double-click the executable file to run it. You can also run it from the
 📄 753DataSync_2025-03-26.log
 ```
 ---
 ## 📝 Example Log
@@ -171,12 +152,6 @@ python 753DataSync.py
 # Run with custom page size
 python 753DataSync.py --results_per_page 50
-# Run the Windows executable with default page size
-753DataSync.exe
-# Run the Windows executable with custom page size
-753DataSync.exe --results_per_page 50
 ```
 ---
````
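As context for the branch name (`feature/logfile-purge-support`), the README's `PURGE_DAYS` cleanup step can be sketched as below. This is an illustrative sketch only: the function name `purge_old_files` and the `.json`/`.log`-older-than-`PURGE_DAYS` behavior come from the README and diff above, but the body and the `directory` parameter are assumptions, not the script's actual code.

```python
import os
import time

def purge_old_files(purge_days, directory="."):
    """Delete .json/.log files in `directory` older than purge_days days.

    Sketch only; the real script's implementation may differ.
    """
    cutoff = time.time() - purge_days * 86400  # purge_days expressed in seconds
    removed = []
    for name in os.listdir(directory):
        if name.endswith((".json", ".log")):
            path = os.path.join(directory, name)
            # Compare the file's modification time against the cutoff
            if os.path.getmtime(path) < cutoff:
                os.remove(path)
                removed.append(name)
    return removed
```

Files with other extensions are left alone, so the purge cannot touch the script itself or its configuration.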

app.py

````diff
@@ -146,16 +146,10 @@ def parse_arguments():
     # Add arguments for results per page
     parser.add_argument('--results_per_page', type=int, default=100, help="Number of results per page (default: 100)")
-    # Add a test flag
-    parser.add_argument('--test', action='store_true', help="If set, only fetch the first page of results.")
-    # Add a reload flag
-    parser.add_argument('--reload', type=str, help="If set, load data from the specified file instead of fetching from the API.")
     # Parse the arguments
     args = parser.parse_args()
-    return args.results_per_page, args.test, args.reload
+    return args.results_per_page

 def generate_token(username, password, url="https://www.arcgis.com/sharing/rest/generateToken"):
     """Generates an authentication token."""
@@ -311,8 +305,8 @@ def main():
     purge_old_files(purge_days)
     # Parse command-line arguments
-    results_per_page, test_mode, reload_file = parse_arguments()
-    logger.info(f"Parsed arguments: results_per_page={results_per_page}, test_mode={test_mode}, reload_file={reload_file}")
+    results_per_page = parse_arguments()
+    logger.info(f"Parsed arguments: results_per_page={results_per_page}")
     # Load environment variables
     logger.info("Loading environment variables.")
@@ -336,22 +330,9 @@ def main():
     fs = os.getenv('FS')
     layer = os.getenv('LAYER')
-    logger.info("Truncating the feature layer.")
+    # Truncate the layer before adding new features
     truncate(token, hostname, instance, fs, layer)
-    # If --reload flag is set, load data from the specified file
-    if reload_file:
-        logger.info(f"Reloading data from file: {reload_file}")
-        # Load data from the specified file
-        with open(reload_file, 'r', encoding='utf-8') as f:
-            aggregated_data = json.load(f)
-        # Add the features to the feature layer
-        response = add_features(token, hostname, instance, fs, layer, aggregated_data)
-        logger.info("Data reloaded successfully from the specified file.")
-        return
     all_data = []
     page_number = 1
@@ -375,11 +356,6 @@ def main():
                 logger.info("No more data to fetch, stopping pagination.")
                 break
-            # Break the loop if in test mode
-            if test_mode:
-                logger.info("Test mode is enabled, stopping after the first page.")
-                break
             page_number += 1
         except Exception as e:
             logger.error(f"Error fetching or saving data for page {page_number}: {e}", exc_info=True)
````
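After this change, `parse_arguments` returns a single value instead of a 3-tuple, so every caller must be updated accordingly. A minimal, self-contained sketch of the reduced parser is shown below; the `argv` parameter is added here only so the function can be exercised without touching `sys.argv` and is not part of the diff.

```python
import argparse

def parse_arguments(argv=None):
    """Parse CLI arguments; only --results_per_page remains after this change."""
    parser = argparse.ArgumentParser(description="753 Data Sync")
    # Number of results requested per API page
    parser.add_argument('--results_per_page', type=int, default=100,
                        help="Number of results per page (default: 100)")
    args = parser.parse_args(argv)
    # Return the single remaining value, matching the new signature in the diff
    return args.results_per_page

print(parse_arguments([]))                            # prints 100 (default)
print(parse_arguments(['--results_per_page', '50']))  # prints 50
```

Because only one value is returned, the old unpacking form `results_per_page, test_mode, reload_file = parse_arguments()` would now raise a `TypeError`, which is why `main()` is changed in the same commit.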