Compare commits: `master`...`v2025.04`

No commits in common. "master" and "v2025.04" have entirely different histories. The `master` revision of `README.md` follows.
# 753 Data Sync

*A Python-based data ingestion tool for syncing enforcement data from a public API to ArcGIS Online.*

---

## 🚀 Overview

This script fetches enforcement data from an external API, truncates a specified feature layer in ArcGIS, and adds the fetched data as features to the layer. It also logs the operation, saves data to JSON files, and optionally purges old files. Additionally, it supports reloading data from a JSON file without making API calls.
---

## 📦 Requirements

- Python 3.6 or higher (if using the Python script)
- Required packages in `requirements.txt`
- `.env` file with your configuration
- ArcGIS Online credentials
---

## ⚙️ Installation

### Python Script

```bash
pip install -r requirements.txt
```

Or install packages individually:

```bash
pip install requests python-dotenv
```

### Windows Executable

A Windows executable is available for users who prefer not to run the script directly. You can download it from the [releases page](https://git.nickhepler.cloud/nick/753-Data-Sync/releases). This executable is compiled using PyInstaller and can be run without needing to install Python or any dependencies.
---

## ⚙️ Configuration

Create a `.env` file in the root of your project:

```env
API_URL=https://example.com/api
AGOL_USER=your_username
AGOL_PASSWORD=your_password
HOSTNAME=www.arcgis.com
INSTANCE=your_instance
FS=your_feature_service
LAYER=0
LOG_LEVEL=DEBUG
PURGE_DAYS=5
```
### Required Variables

| Variable        | Description                          |
|-----------------|--------------------------------------|
| `API_URL`       | The API endpoint to fetch data from  |
| `AGOL_USER`     | ArcGIS Online username               |
| `AGOL_PASSWORD` | ArcGIS Online password               |
| `HOSTNAME`      | ArcGIS host (e.g., `www.arcgis.com`) |
| `INSTANCE`      | ArcGIS REST instance path            |
| `FS`            | Feature service name                 |
| `LAYER`         | Feature layer ID or name             |

### Optional Variables

| Variable     | Description                                                 |
|--------------|-------------------------------------------------------------|
| `LOG_LEVEL`  | Log level (`DEBUG`, `INFO`, `WARNING`, `ERROR`, `CRITICAL`) |
| `PURGE_DAYS` | Number of days to retain logs and JSONs                     |
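In `app.py`, these values are read with `os.getenv` after `python-dotenv` loads the file. A minimal sketch of that lookup (the `load_config` helper itself is illustrative, not part of the script; the 30-day `PURGE_DAYS` fallback mirrors the default used in `main()`):

```python
import os

def load_config(env=os.environ):
    """Collect the script's settings from an environment mapping."""
    required = ["API_URL", "AGOL_USER", "AGOL_PASSWORD",
                "HOSTNAME", "INSTANCE", "FS", "LAYER"]
    missing = [name for name in required if not env.get(name)]
    if missing:
        raise KeyError(f"Missing required variables: {', '.join(missing)}")

    config = {name: env[name] for name in required}
    # Optional variables fall back to defaults when absent.
    config["LOG_LEVEL"] = env.get("LOG_LEVEL", "INFO").upper()
    config["PURGE_DAYS"] = int(env.get("PURGE_DAYS", 30))
    return config
```

Failing fast on missing required variables gives a clearer error than a `None` surfacing later inside an API call.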
---

## 🧪 Script Usage

### Python Script

```bash
python 753DataSync.py --results_per_page 100
```
### Windows Executable

Simply double-click the executable file to run it. You can also run it from the command line with:

```bash
753DataSync.exe --results_per_page 100
```
### CLI Arguments

| Argument             | Description                                                                      |
|----------------------|----------------------------------------------------------------------------------|
| `--results_per_page` | Optional. Number of results per API call (default: `100`)                        |
| `--test`             | Optional. If set, only fetch the first page of results.                          |
| `--reload`           | Optional. Load data from a specified JSON file instead of fetching from the API. |
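These flags match the `argparse` definitions in `app.py`'s `parse_arguments()`; a self-contained sketch of an equivalent parser (the `build_parser` wrapper is illustrative):

```python
import argparse

def build_parser():
    """Build a parser matching the CLI arguments in the table above."""
    parser = argparse.ArgumentParser(description="753 Data Sync")
    parser.add_argument('--results_per_page', type=int, default=100,
                        help="Number of results per page (default: 100)")
    parser.add_argument('--test', action='store_true',
                        help="If set, only fetch the first page of results.")
    parser.add_argument('--reload', type=str,
                        help="Load data from the specified file instead of fetching from the API.")
    return parser
```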
---

## 📋 Functionality

1. **🔁 Truncate Layer** — Clears existing ArcGIS features.
2. **🌐 Fetch Data** — Retrieves paginated data from the API.
3. **💾 Save Data** — Writes each page to a time-stamped JSON file.
4. **📦 Aggregate Data** — Combines all pages into one file.
5. **📤 Add Features** — Sends data to the ArcGIS feature layer.
6. **🧹 File Cleanup** — Deletes `.json`/`.log` files older than `PURGE_DAYS`.
7. **📑 Dynamic Logs** — Logs saved to `753DataSync_YYYY-MM-DD.log`.
8. **🧪 Test Mode** — Use the `--test` flag to fetch only the first page of results for testing purposes.
9. **🔄 Reload Data** — Use the `--reload` flag to truncate the feature layer and load data from a specified JSON file.
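Steps 2–4 boil down to one loop. In this sketch, `fetch_page` stands in for the script's `fetch_data` call; the short-page stop rule and the `--test` early exit match the logic in `main()` (the `fetch_all_pages` helper itself is illustrative — the script inlines this loop):

```python
def fetch_all_pages(fetch_page, results_per_page=100, test_mode=False):
    """Gather pages until a short page signals the end of the data."""
    all_data = []
    page_number = 1
    while True:
        data = fetch_page(page_number, results_per_page)
        all_data.extend(data)
        # A page smaller than results_per_page means there is no next page.
        if len(data) < results_per_page:
            break
        # --test stops after the first page.
        if test_mode:
            break
        page_number += 1
    return all_data
```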
---

## 📁 Example Output

- Individual page files are saved in the `data/` directory with filenames like `enforcement_page_1_results_100_2025-03-26_14-30-45.json`.
- The aggregated file is saved as `aggregated_enforcement_results_<timestamp>.json`.
- Logs are written to a daily log file and printed to the console.

```bash
📁 data/
├── enforcement_page_1_results_100_2025-03-26_14-30-45.json
├── enforcement_page_2_results_100_2025-03-26_14-31-10.json
└── aggregated_enforcement_results_2025-03-26_14-31-15.json

📄 753DataSync_2025-03-26.log
```
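The page filenames follow the f-string pattern built in `app.py`; wrapping it in a helper for illustration (the `page_filename` function itself is not in the script):

```python
from datetime import datetime

def page_filename(page_number, results_per_page, now):
    """Build the per-page JSON path used in the data/ directory."""
    timestamp = now.strftime("%Y-%m-%d_%H-%M-%S")
    return f"data/enforcement_page_{page_number}_results_{results_per_page}_{timestamp}.json"
```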
---

## 📝 Example Log

```text
2025-03-26 14:30:45 - INFO - Attempting to truncate layer...
2025-03-26 14:30:51 - INFO - Fetching page 1 from API...
2025-03-26 14:30:55 - INFO - Saved data to data/enforcement_page_1_results_100_...
2025-03-26 14:30:57 - INFO - Aggregated data saved.
2025-03-26 14:31:00 - INFO - Features added successfully.
2025-03-26 14:31:01 - INFO - Deleted old log: 753DataSync_2025-03-19.log
```
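The formatter string behind this layout is not visible in the `app.py` excerpts; a handler consistent with the example output would look like this (the `%Y-%m-%d %H:%M:%S` date format is inferred from the log lines, and `make_console_logger` is an illustrative wrapper):

```python
import logging

def make_console_logger(log_level="INFO"):
    """Console logger whose output matches the example log layout (sketch)."""
    logger = logging.getLogger("753DataSync.example")
    logger.setLevel(getattr(logging, log_level.upper(), logging.INFO))
    handler = logging.StreamHandler()
    handler.setFormatter(logging.Formatter(
        "%(asctime)s - %(levelname)s - %(message)s",
        datefmt="%Y-%m-%d %H:%M:%S"))
    logger.addHandler(handler)
    return logger
```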
---

## 🛠 Troubleshooting

- Set `LOG_LEVEL=DEBUG` in `.env` for detailed logs.
- Ensure `.env` has no syntax errors and contains valid credentials and the correct API URL.
- Make sure your ArcGIS layer has permission for truncation and writes.
- Check for internet/API access and expired ArcGIS tokens.
- Logs are written to both the console and daily log files; check them first if the script stops unexpectedly.
---

## 🧪 Testing

Currently, the script is tested manually. Automated testing may be added under a `/tests` folder in the future.
---

## 📖 Usage Examples

```bash
# Run with default page size
python 753DataSync.py

# Run with custom page size
python 753DataSync.py --results_per_page 50

# Run the Windows executable with default page size
753DataSync.exe

# Run the Windows executable with custom page size
753DataSync.exe --results_per_page 50
```
---

## 💬 Support

Found a bug or want to request a feature?

[Open an issue](https://git.nickhepler.cloud/nick/753-Data-Sync/issues) or contact [@nick](https://git.nickhepler.cloud/nick) directly.

---

## 📜 License

This project is licensed under the [GNU General Public License v3.0](LICENSE).

> 💡 *You are free to use, modify, and share this project as long as you preserve the same license in your changes.*
---

# app.py

Excerpts from the `master` revision of `app.py` follow, one block per diff hunk.

Hunk `@@ -4,11 +4,9 @@` (imports; `import sys` is the context line at the top of this hunk):

```python
import os
import json
from datetime import datetime
from datetime import timedelta
import argparse
import urllib.parse
from dotenv import load_dotenv
import time

# Load environment variables from .env file
load_dotenv("753DataSync.env")
```
Hunk `@@ -17,10 +15,6 @@`:

```python
BASE_URL = "{}/{}/{}"
log_level = os.getenv('LOG_LEVEL', 'INFO').upper()

# Get the current date for dynamic log file naming
current_date = datetime.now().strftime("%Y-%m-%d")
log_filename = f"753DataSync_{current_date}.log"

# Setup logging
logger = logging.getLogger()
```
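`BASE_URL` joins the API base URL with the page number and page size, e.g. `https://api.example.com/1/100`. A wrapper for illustration (`build_url` is not a function in the script; `fetch_data` formats the string inline):

```python
BASE_URL = "{}/{}/{}"

def build_url(api_url, page_number, results_per_page):
    """Format the paginated request URL the same way fetch_data does."""
    return BASE_URL.format(api_url, page_number, results_per_page)
```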
Hunk `@@ -38,8 +32,8 @@`:

```python
else:
    logger.setLevel(logging.INFO)

# File handler for dynamic log file
file_handler = logging.FileHandler(log_filename)
file_handler.setLevel(getattr(logging, log_level))

# Stream handler (console output)
```
Hunk `@@ -55,35 +49,6 @@` (`purge_old_files` and the start of `fetch_data`):

```python
logger.addHandler(file_handler)
logger.addHandler(stream_handler)


def purge_old_files(purge_days):
    """Purge log and data files older than PURGE_DAYS from the 'data' folder."""
    data_folder = 'data'
    log_folder = '.'  # Log files are in the current directory

    if not os.path.exists(data_folder):
        logger.warning(f"The '{data_folder}' folder does not exist.")
        return

    purge_threshold = datetime.now() - timedelta(days=purge_days)

    # Delete old log files
    for filename in os.listdir(log_folder):
        if filename.endswith(".log"):
            file_path = os.path.join(log_folder, filename)
            file_modified_time = datetime.fromtimestamp(os.path.getmtime(file_path))
            if file_modified_time < purge_threshold:
                logger.info(f"Deleting old log file: {file_path}")
                os.remove(file_path)

    # Delete old data files
    for filename in os.listdir(data_folder):
        file_path = os.path.join(data_folder, filename)
        if filename.endswith(".json"):
            file_modified_time = datetime.fromtimestamp(os.path.getmtime(file_path))
            if file_modified_time < purge_threshold:
                logger.info(f"Deleting old data file: {file_path}")
                os.remove(file_path)


def fetch_data(api_url, page_number, results_per_page):
    """Fetches data from the API and returns the response."""
    url = BASE_URL.format(api_url, page_number, results_per_page)
```
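The deletion condition in `purge_old_files` reduces to one comparison against a cutoff date; isolated here for clarity (the `is_expired` helper is not in the script):

```python
from datetime import datetime, timedelta

def is_expired(modified, now, purge_days):
    """True when a file's mtime is older than the purge threshold."""
    return modified < now - timedelta(days=purge_days)
```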
Hunk `@@ -146,16 +111,10 @@` (inside `parse_arguments()`):

```python
    # Add arguments for results per page
    parser.add_argument('--results_per_page', type=int, default=100, help="Number of results per page (default: 100)")

    # Add a test flag
    parser.add_argument('--test', action='store_true', help="If set, only fetch the first page of results.")

    # Add a reload flag
    parser.add_argument('--reload', type=str, help="If set, load data from the specified file instead of fetching from the API.")

    # Parse the arguments
    args = parser.parse_args()

    return args.results_per_page, args.test, args.reload


def generate_token(username, password, url="https://www.arcgis.com/sharing/rest/generateToken"):
    """Generates an authentication token."""
```
Hunk `@@ -300,19 +259,12 @@` (start of `main()`):

```python
def main():
    """Main entry point for the script."""
    start_time = time.time()

    try:
        logger.info("Starting script execution.")

        # Check and purge old files before processing
        purge_days = int(os.getenv("PURGE_DAYS", 30))  # Default to 30 days if not set
        logger.info(f"Purging files older than {purge_days} days.")
        purge_old_files(purge_days)

        # Parse command-line arguments
        results_per_page, test_mode, reload_file = parse_arguments()
        logger.info(f"Parsed arguments: results_per_page={results_per_page}, test_mode={test_mode}, reload_file={reload_file}")

        # Load environment variables
        logger.info("Loading environment variables.")
```
Hunk `@@ -336,22 +288,9 @@` (inside `main()`):

```python
        fs = os.getenv('FS')
        layer = os.getenv('LAYER')

        # Truncate the layer before adding new features
        logger.info("Truncating the feature layer.")
        truncate(token, hostname, instance, fs, layer)

        # If --reload flag is set, load data from the specified file
        if reload_file:
            logger.info(f"Reloading data from file: {reload_file}")

            # Load data from the specified file
            with open(reload_file, 'r', encoding='utf-8') as f:
                aggregated_data = json.load(f)

            # Add the features to the feature layer
            response = add_features(token, hostname, instance, fs, layer, aggregated_data)
            logger.info("Data reloaded successfully from the specified file.")
            return

        all_data = []
        page_number = 1
```
Hunk `@@ -366,42 +305,35 @@` (the page-fetch loop and wrap-up in `main()`; the loop header itself sits above this hunk):

```python
                timestamp = datetime.now().strftime("%Y-%m-%d_%H-%M-%S")
                page_filename = f"data/enforcement_page_{page_number}_results_{results_per_page}_{timestamp}.json"

                # Save individual page data if in DEBUG mode
                if log_level == 'DEBUG':
                    save_json(data, page_filename)

                # Stop if last page
                if len(data) < results_per_page:
                    logger.info("No more data to fetch, stopping pagination.")
                    break

                # Break the loop if in test mode
                if test_mode:
                    logger.info("Test mode is enabled, stopping after the first page.")
                    break

                page_number += 1
            except Exception as e:
                logger.error(f"Error fetching or saving data for page {page_number}: {e}", exc_info=True)
                break

        # Prepare aggregated data
        aggregated_data = all_data

        # Save aggregated data
        aggregated_filename = f"data/aggregated_enforcement_results_{timestamp}.json"
        logger.info(f"Saving aggregated data to {aggregated_filename}.")
        save_json(aggregated_data, aggregated_filename)

        # Add the features to the feature layer
        response = add_features(token, hostname, instance, fs, layer, aggregated_data)

    except Exception as e:
        logger.error(f"An unexpected error occurred: {e}", exc_info=True)
        return
    finally:
        elapsed_time = timedelta(seconds=time.time() - start_time)
        logger.info(f"Script execution completed in {str(elapsed_time)}.")


if __name__ == "__main__":
    main()
```
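`save_json` is called throughout `main()` but falls outside the hunks shown above. A plausible sketch, assuming it simply serializes to UTF-8 JSON (the directory-creation safeguard is an assumption, not confirmed from the source):

```python
import json
import os

def save_json(data, filename):
    """Write data to filename as JSON (sketch of the unshown helper)."""
    directory = os.path.dirname(filename)
    if directory:
        # Assumed safeguard: make sure the data/ directory exists first.
        os.makedirs(directory, exist_ok=True)
    with open(filename, "w", encoding="utf-8") as f:
        json.dump(data, f, ensure_ascii=False, indent=2)
```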