# 753 Data Sync

*A Python-based data ingestion tool for syncing enforcement data from a public API to ArcGIS Online.*

---

## 🚀 Overview

This script fetches enforcement data from an external API, truncates a specified feature layer in ArcGIS Online, and adds the fetched data as features to the layer. It also logs each operation, saves the fetched data to JSON files, and optionally purges old files. Additionally, it supports reloading data from a previously saved JSON file without making any API calls.

---

## 📦 Requirements

- Python 3.6 or higher (if using the Python script)
- Required packages listed in `requirements.txt`
- A `.env` file with your configuration (see below)
- ArcGIS Online credentials
---

## ⚙️ Installation

### Python Script

Install the required dependencies with `pip`:

```bash
pip install -r requirements.txt
```

Or install the packages individually:

```bash
pip install requests python-dotenv
```

### Windows Executable

A Windows executable is available for users who prefer not to run the script directly. You can download it from the [releases page](https://git.nickhepler.cloud/nick/753-Data-Sync/releases). It is compiled with PyInstaller and runs without a Python installation or any dependencies.

---

## ⚙️ Configuration

Before running the script, create a `.env` file in the root of your project with the following variables:

```env
API_URL=your_api_url
AGOL_USER=your_arcgis_online_username
AGOL_PASSWORD=your_arcgis_online_password
HOSTNAME=your_arcgis_host
INSTANCE=your_arcgis_instance
FS=your_feature_service
LAYER=your_layer_id
LOG_LEVEL=DEBUG
PURGE_DAYS=5
```
### Required Variables

| Variable        | Description                                          |
|-----------------|------------------------------------------------------|
| `API_URL`       | The API endpoint to fetch data from                  |
| `AGOL_USER`     | ArcGIS Online username                               |
| `AGOL_PASSWORD` | ArcGIS Online password                               |
| `HOSTNAME`      | ArcGIS host (e.g., `www.arcgis.com`)                 |
| `INSTANCE`      | ArcGIS REST instance path                            |
| `FS`            | Feature service name                                 |
| `LAYER`         | Feature layer ID or name to truncate and write to    |

### Optional Variables

| Variable     | Description                                          |
|--------------|------------------------------------------------------|
| `LOG_LEVEL`  | Log level (`DEBUG`, `INFO`, etc.; default: `INFO`)   |
| `PURGE_DAYS` | Number of days to retain logs and JSON files         |
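A quick way to sanity-check these settings before a run is a small validation step. The `check_config` helper below is a hypothetical sketch, not part of the script (which reads each variable individually via `os.getenv` after loading the `.env` file with `python-dotenv`):

```python
import os

# Names the script requires; absence of any of these should abort a run
REQUIRED = ["API_URL", "AGOL_USER", "AGOL_PASSWORD", "HOSTNAME", "INSTANCE", "FS", "LAYER"]

def check_config(env=os.environ):
    """Return (missing, options): required names absent from env, plus resolved optional settings."""
    missing = [name for name in REQUIRED if not env.get(name)]
    options = {
        "log_level": env.get("LOG_LEVEL", "INFO").upper(),
        "purge_days": int(env.get("PURGE_DAYS", 30)),
    }
    return missing, options

# Example: a config that forgot the password falls back to the defaults for the optional values
missing, options = check_config({"API_URL": "https://example.com/api", "AGOL_USER": "nick",
                                 "HOSTNAME": "www.arcgis.com", "INSTANCE": "x",
                                 "FS": "fs", "LAYER": "0"})
```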
---

## 🧪 Script Usage

### Python Script

Run the script with:

```bash
python 753DataSync.py --results_per_page 100
```

### Windows Executable

Simply double-click the executable file to run it. You can also run it from the command line:

```bash
753DataSync.exe --results_per_page 100
```

### CLI Arguments

| Argument             | Description                                                                      |
|----------------------|----------------------------------------------------------------------------------|
| `--results_per_page` | Optional. Number of results per API call (default: `100`)                        |
| `--test`             | Optional. If set, only fetch the first page of results.                          |
| `--reload`           | Optional. Load data from a specified JSON file instead of fetching from the API. |

---

## ⚠️ Error Handling

- If an error occurs while fetching data, the script logs the error and stops execution.
- If the `truncate` or `add_features` operations fail, the script logs the error and stops execution.
- HTTP errors and network-related errors are handled gracefully.
---

## 📋 Functionality

1. **🔁 Truncate Layer** — Clears existing features from the ArcGIS layer so it is empty and ready for new data.
2. **🌐 Fetch Data** — Retrieves paginated data from the API, one page at a time, until all data is retrieved.
3. **💾 Save Data** — Writes each page to a time-stamped JSON file.
4. **📦 Aggregate Data** — Combines all pages into one file.
5. **📤 Add Features** — Sends the aggregated data to the ArcGIS feature layer.
6. **🧹 File Cleanup** — Deletes `.json`/`.log` files older than `PURGE_DAYS`.
7. **📑 Dynamic Logs** — Logs are saved to `753DataSync_YYYY-MM-DD.log`.
8. **🧪 Test Mode** — Use the `--test` flag to fetch only the first page of results for testing purposes.
9. **🔄 Reload Data** — Use the `--reload` flag to truncate the feature layer and load data from a specified JSON file.
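Steps 2 and 4 above boil down to a fetch-and-aggregate loop that stops at the first short page. A minimal sketch with a stubbed `fetch_page` callable (the names here are illustrative, not the script's own):

```python
from typing import Callable, List

def fetch_all(fetch_page: Callable[[int, int], List[dict]], results_per_page: int = 100,
              test_mode: bool = False) -> List[dict]:
    """Fetch pages until a short page signals the end; return all records combined."""
    all_data: List[dict] = []
    page_number = 1
    while True:
        data = fetch_page(page_number, results_per_page)
        all_data.extend(data)
        # A page with fewer records than requested is the last page
        if len(data) < results_per_page:
            break
        # --test stops after the first full page
        if test_mode:
            break
        page_number += 1
    return all_data

# Stub API serving 250 records in pages
records = [{"id": i} for i in range(250)]
def fake_page(page: int, per_page: int) -> List[dict]:
    start = (page - 1) * per_page
    return records[start:start + per_page]
```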
---

## 📁 Example Output

```text
📁 data/
├── enforcement_page_1_results_100_2025-03-26_14-30-45.json
├── enforcement_page_2_results_100_2025-03-26_14-31-10.json
└── aggregated_enforcement_results_2025-03-26_14-31-15.json

📄 753DataSync_2025-03-26.log
```
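The time-stamped names above come from a simple `strftime` pattern; the format string below matches the script, though the helper wrapping it is illustrative:

```python
from datetime import datetime

def page_filename(page_number: int, results_per_page: int, now: datetime) -> str:
    """Build the time-stamped per-page filename used in the data/ folder."""
    timestamp = now.strftime("%Y-%m-%d_%H-%M-%S")
    return f"data/enforcement_page_{page_number}_results_{results_per_page}_{timestamp}.json"
```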
---

## 📝 Example Log

```text
2025-03-26 14:30:45 - INFO - Attempting to truncate layer on https://www.arcgis.com/...
2025-03-26 14:30:50 - INFO - Successfully truncated layer: https://www.arcgis.com/...
2025-03-26 14:30:51 - INFO - Making request to: https://api.example.com/1/100
2025-03-26 14:30:55 - INFO - Data saved to data/enforcement_page_1_results_100_2025-03-26_14-30-45.json
2025-03-26 14:30:56 - INFO - No more data to fetch, stopping pagination.
2025-03-26 14:30:57 - INFO - Data saved to data/aggregated_enforcement_results_2025-03-26_14-30-45.json
2025-03-26 14:31:00 - INFO - Features added successfully.
2025-03-26 14:31:01 - INFO - Deleted old log: 753DataSync_2025-03-19.log
```
---

## 🛠 Troubleshooting

- If the script stops unexpectedly, check the daily log file (e.g., `753DataSync_2025-03-26.log`) for detailed error information.
- Set `LOG_LEVEL=DEBUG` in `.env` for more detailed logs.
- Ensure your `.env` file is correctly configured, with valid credentials, a valid API URL, and no syntax errors.
- Make sure the specified ArcGIS layer is accessible and that your account has permission to truncate it and add features.
- Check for internet/API access and expired ArcGIS tokens.
- Logs are written to both the console and daily log files.
---

## 🧪 Testing

Currently, the script is tested manually. Automated testing may be added under a `/tests` folder in the future.
---

## 📖 Usage Examples

```bash
# Run with the default page size
python 753DataSync.py

# Run with a custom page size
python 753DataSync.py --results_per_page 50

# Run the Windows executable with the default page size
753DataSync.exe

# Run the Windows executable with a custom page size
753DataSync.exe --results_per_page 50
```
---

## 💬 Support

Found a bug or want to request a feature?
[Open an issue](https://git.nickhepler.cloud/nick/753-Data-Sync/issues) or contact [@nick](https://git.nickhepler.cloud/nick) directly.

---

## 📜 License

This project is licensed under the [GNU General Public License v3.0](LICENSE) or later. See the [LICENSE](LICENSE) file for details.

> 💡 *You are free to use, modify, and share this project as long as you preserve the same license in your changes.*
---

## app.py

*Consolidated source of the revision with dynamic logging, file purging, and `--test`/`--reload` support. A few context lines elided by the diff hunks (notably request payload fields) are bridged with standard ArcGIS REST parameters and flagged in comments.*

```python
import logging
import sys
import os
import json
from datetime import datetime
from datetime import timedelta
import argparse
import urllib.parse
import time

import requests
from dotenv import load_dotenv

# Load environment variables from .env file
load_dotenv("753DataSync.env")

# Configuration
BASE_URL = "{}/{}/{}"
log_level = os.getenv('LOG_LEVEL', 'INFO').upper()

# Get the current date for dynamic log file naming
current_date = datetime.now().strftime("%Y-%m-%d")
log_filename = f"753DataSync_{current_date}.log"

# Setup logging (fall back to INFO if LOG_LEVEL is not a valid level name)
logger = logging.getLogger()
logger.setLevel(getattr(logging, log_level, logging.INFO))

# File handler for the dynamic log file
file_handler = logging.FileHandler(log_filename)
file_handler.setLevel(getattr(logging, log_level, logging.INFO))

# Stream handler (console output)
stream_handler = logging.StreamHandler(sys.stdout)
stream_handler.setLevel(getattr(logging, log_level, logging.INFO))

# Log format
formatter = logging.Formatter('%(asctime)s - %(levelname)s - %(message)s')
file_handler.setFormatter(formatter)
stream_handler.setFormatter(formatter)

logger.addHandler(file_handler)
logger.addHandler(stream_handler)


def purge_old_files(purge_days):
    """Purge log and data files older than PURGE_DAYS from the 'data' folder."""
    data_folder = 'data'
    log_folder = '.'  # Log files are in the current directory

    if not os.path.exists(data_folder):
        logger.warning(f"The '{data_folder}' folder does not exist.")
        return

    purge_threshold = datetime.now() - timedelta(days=purge_days)

    # Delete old log files
    for filename in os.listdir(log_folder):
        if filename.endswith(".log"):
            file_path = os.path.join(log_folder, filename)
            file_modified_time = datetime.fromtimestamp(os.path.getmtime(file_path))
            if file_modified_time < purge_threshold:
                logger.info(f"Deleting old log file: {file_path}")
                os.remove(file_path)

    # Delete old data files
    for filename in os.listdir(data_folder):
        file_path = os.path.join(data_folder, filename)
        if filename.endswith(".json"):
            file_modified_time = datetime.fromtimestamp(os.path.getmtime(file_path))
            if file_modified_time < purge_threshold:
                logger.info(f"Deleting old data file: {file_path}")
                os.remove(file_path)


def fetch_data(api_url, page_number, results_per_page):
    """Fetches data from the API and returns the response."""
    url = BASE_URL.format(api_url, page_number, results_per_page)

    try:
        logger.info(f"Making request to: {url} with page_number={page_number} and results_per_page={results_per_page}")
        response = requests.get(url)

        # Check for HTTP errors
        response.raise_for_status()

        # Success log
        logger.info(f"Successfully fetched data from {url}. Status code: {response.status_code}.")

        # Debug log with additional response details
        logger.debug(f"GET request to {url} completed with status code {response.status_code}. "
                     f"Response time: {response.elapsed.total_seconds()} seconds.")

        # Return JSON data
        return response.json()

    except requests.exceptions.HTTPError as http_err:
        logger.error(f"HTTP error occurred while fetching data from {url}: {http_err}")
        sys.exit(1)
    except requests.exceptions.RequestException as req_err:
        logger.error(f"Request error occurred while fetching data from {url}: {req_err}")
        sys.exit(1)
    except Exception as err:
        logger.exception(f"An unexpected error occurred while fetching data from {url}: {err}")
        sys.exit(1)


def save_json(data, filename):
    """Saves data to a JSON file."""
    try:
        # Ensure directory exists
        if not os.path.exists('data'):
            os.makedirs('data')
            logger.info("Directory 'data' created.")

        # Save data to file
        with open(filename, 'w', encoding='utf-8') as f:
            json.dump(data, f, ensure_ascii=False, indent=4)

        logger.info(f"Data successfully saved to {filename}.")

    except OSError as e:
        # OSError covers IOError in Python 3
        logger.error(f"OS error occurred while saving JSON data to {filename}: {e}")
        sys.exit(1)
    except Exception as e:
        logger.error(f"Unexpected error occurred while saving JSON data to {filename}: {e}")
        sys.exit(1)


def parse_arguments():
    """Parses command-line arguments."""
    parser = argparse.ArgumentParser()

    # Add arguments for results per page
    parser.add_argument('--results_per_page', type=int, default=100, help="Number of results per page (default: 100)")

    # Add a test flag
    parser.add_argument('--test', action='store_true', help="If set, only fetch the first page of results.")

    # Add a reload flag
    parser.add_argument('--reload', type=str, help="If set, load data from the specified file instead of fetching from the API.")

    # Parse the arguments
    args = parser.parse_args()

    return args.results_per_page, args.test, args.reload


def generate_token(username, password, url="https://www.arcgis.com/sharing/rest/generateToken"):
    """Generates an authentication token."""
    # Leading payload fields are elided in the diff; standard generateToken parameters shown
    payload = {
        'username': username,
        'password': password,
        'referer': 'https://www.arcgis.com',
        'f': 'json',
        'expiration': '120'
    }
    headers = {}

    try:
        logger.info(f"Generating token for username '{username}' using URL: {url}")
        response = requests.post(url, headers=headers, data=payload)

        # Log the request status and response time
        logger.debug(f"POST request to {url} completed with status code {response.status_code}. "
                     f"Response time: {response.elapsed.total_seconds()} seconds.")

        response.raise_for_status()  # Raise an error for bad status codes

        # Extract token from the response
        token = response.json().get('token')

        if token:
            logger.info("Token generated successfully.")
        else:
            logger.error("Token not found in the response.")
            sys.exit(1)

        return token

    except requests.exceptions.RequestException as e:
        logger.error(f"Error generating token for username '{username}': {e}")
        sys.exit(1)
    except Exception as e:
        logger.exception(f"Unexpected error generating token for username '{username}': {e}")
        sys.exit(1)


def truncate(token, hostname, instance, fs, layer, secure=True):
    """Truncates (deletes all features from) a hosted feature layer."""
    protocol = 'https://' if secure else 'http://'
    url = f"{protocol}{hostname}/{instance}/arcgis/rest/admin/services/{fs}/FeatureServer/{layer}/truncate?token={token}&async=true&f=json"

    try:
        # Attempt the POST request
        logger.info(f"Attempting to truncate layer {layer} on {hostname}...")

        # Debug logging for the URL being used
        logger.debug(f"Truncate URL: {url}")

        response = requests.post(url, timeout=30)

        # Log response time
        logger.debug(f"POST request to {url} completed with status code {response.status_code}. "
                     f"Response time: {response.elapsed.total_seconds()} seconds.")

        # Check for HTTP errors
        response.raise_for_status()  # Raise an exception for HTTP errors (4xx, 5xx)

        if response.status_code == 200:
            result = response.json()
            if 'error' in result:
                logger.error(f"Error truncating layer {layer}: {result['error']}")
                return None
            logger.info(f"Successfully truncated layer: {protocol}{hostname}/{instance}/arcgis/rest/admin/services/{fs}/FeatureServer/{layer}.")
            return result
        else:
            logger.error(f"Unexpected response for layer {layer}: {response.status_code} - {response.text}")
            return None

    except requests.exceptions.Timeout as e:
        logger.error(f"Request timed out while truncating layer {layer}: {e}")
        return None
    except requests.exceptions.RequestException as e:
        # Network-related errors, timeouts, etc.
        logger.error(f"Request failed while truncating layer {layer}: {e}")
        return None
    except Exception as e:
        # Any other unexpected errors
        logger.error(f"An unexpected error occurred while truncating layer {layer}: {e}")
        return None


def add_features(token, hostname, instance, fs, layer, aggregated_data, secure=True):
    """Add features to a feature service."""
    protocol = 'https://' if secure else 'http://'
    url = f"{protocol}{hostname}/{instance}/arcgis/rest/services/{fs}/FeatureServer/{layer}/addFeatures?token={token}&rollbackOnFailure=true&f=json"

    logger.info(f"Attempting to add features to {protocol}{hostname}/{instance}/arcgis/rest/services/{fs}/FeatureServer/{layer}...")

    # Prepare features data as the payload
    features_json = json.dumps(aggregated_data)  # Convert aggregated data to JSON string
    # Payload/header construction is elided in the diff; standard addFeatures form fields shown
    headers = {'Content-Type': 'application/x-www-form-urlencoded'}
    payload = {
        'features': features_json,
        'f': 'json'
    }

    try:
        # Log request details (but avoid logging sensitive data)
        logger.debug(f"Request URL: {url}")
        logger.debug(f"Payload size: {len(features_json)} characters")

        response = requests.post(url, headers=headers, data=payload, timeout=180)

        # Log the response time and status code
        logger.debug(f"POST request to {url} completed with status code {response.status_code}. "
                     f"Response time: {response.elapsed.total_seconds()} seconds.")

        response.raise_for_status()  # Raise an error for bad status codes

        logger.info("Features added successfully.")

        # Log any successful response details
        if response.status_code == 200:
            logger.debug(f"Response JSON size: {len(response.text)} characters.")

        return response.json()

    except requests.exceptions.Timeout as e:
        logger.error(f"Request timed out while adding features: {e}")
        return {'error': 'Request timed out'}

    except requests.exceptions.RequestException as e:
        logger.error(f"Request error occurred while adding features: {e}")
        return {'error': str(e)}

    except json.JSONDecodeError as e:
        logger.error(f"Error decoding JSON response while adding features: {e}")
        return {'error': 'Invalid JSON response'}

    except Exception as e:
        logger.error(f"An unexpected error occurred while adding features: {e}")
        return {'error': str(e)}


def main():
    """Main entry point for the script."""
    start_time = time.time()

    try:
        logger.info("Starting script execution.")
        load_dotenv("753DataSync.env")
        api_url = os.getenv('API_URL')
        if not api_url:
            logger.error("API_URL environment variable not found.")
            return

        # Check and purge old files before processing
        purge_days = int(os.getenv("PURGE_DAYS", 30))  # Default to 30 days if not set
        logger.info(f"Purging files older than {purge_days} days.")
        purge_old_files(purge_days)

        # Generate the token
        username = os.getenv('AGOL_USER')
        password = os.getenv('AGOL_PASSWORD')
        if not username or not password:
            logger.error("Missing AGOL_USER or AGOL_PASSWORD in environment variables.")
            return
        token = generate_token(username, password)

        # Parse command-line arguments
        results_per_page, test_mode, reload_file = parse_arguments()
        logger.info(f"Parsed arguments: results_per_page={results_per_page}, test_mode={test_mode}, reload_file={reload_file}")

        # Set ArcGIS host details
        hostname = os.getenv('HOSTNAME')
        instance = os.getenv('INSTANCE')
        fs = os.getenv('FS')
        layer = os.getenv('LAYER')

        # Truncate the layer before adding new features
        logger.info("Truncating the feature layer.")
        truncate(token, hostname, instance, fs, layer)

        # If --reload flag is set, load data from the specified file
        if reload_file:
            logger.info(f"Reloading data from file: {reload_file}")

            # Load data from the specified file
            with open(reload_file, 'r', encoding='utf-8') as f:
                aggregated_data = json.load(f)

            # Add the features to the feature layer
            response = add_features(token, hostname, instance, fs, layer, aggregated_data)
            logger.info("Data reloaded successfully from the specified file.")
            return

        all_data = []
        page_number = 1
        # Defined up front so the aggregated filename exists even if the loop exits early
        timestamp = datetime.now().strftime("%Y-%m-%d_%H-%M-%S")

        while True:
            try:
                # Fetch data from the API
                data = fetch_data(api_url, page_number, results_per_page)

                # Append features data to the aggregated list
                all_data.extend(data)  # Data is a list of features

                # Generate filename with timestamp for the individual page
                timestamp = datetime.now().strftime("%Y-%m-%d_%H-%M-%S")
                page_filename = f"data/enforcement_page_{page_number}_results_{results_per_page}_{timestamp}.json"

                # Save individual page data if in DEBUG mode
                if log_level == 'DEBUG':
                    save_json(data, page_filename)

                # A page with fewer records than results_per_page is the last page
                if len(data) < results_per_page:
                    logger.info("No more data to fetch, stopping pagination.")
                    break

                # Break the loop if in test mode
                if test_mode:
                    logger.info("Test mode is enabled, stopping after the first page.")
                    break

                page_number += 1
            except Exception as e:
                logger.error(f"Error fetching or saving data for page {page_number}: {e}", exc_info=True)
                break

        # Prepare aggregated data
        aggregated_data = all_data  # Use the collected features directly

        # Save aggregated data to a single JSON file
        aggregated_filename = f"data/aggregated_enforcement_results_{timestamp}.json"
        logger.info(f"Saving aggregated data to {aggregated_filename}.")
        save_json(aggregated_data, aggregated_filename)

        # Add the features to the feature layer
        response = add_features(token, hostname, instance, fs, layer, aggregated_data)
        logger.info(f"Add features response: {json.dumps(response, indent=2)}")

    except Exception as e:
        logger.error(f"An unexpected error occurred: {e}", exc_info=True)
        return
    finally:
        elapsed_time = timedelta(seconds=time.time() - start_time)
        logger.info(f"Script execution completed in {str(elapsed_time)}.")


if __name__ == "__main__":
    main()
```