notes and insights

The document provides instructions for setting up a Python environment using Poetry, including commands for activating the environment and managing dependencies. It outlines the project structure for a machine learning application, detailing directories for data, models, source code, the API, and dashboards; the layout also includes a README for the project overview and a requirements file for Python dependencies.

This is the command I found useful for putting the Poetry scripts directory on PATH:

$env:Path += ";C:\Users\vashi\AppData\Roaming\Python\Scripts"
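After updating PATH, a quick way to confirm the poetry executable is reachable (assuming Poetry was installed with the standard Windows installer) is:

poetry --version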

This is the command you use for activating the Poetry virtual environment in PowerShell:

& "$(poetry env info --path)\Scripts\Activate.ps1"

=========================================================================================

We have set up the Jupyter kernel using Poetry for the venv. You have to install the dependencies and the kernel for the Poetry environment to be usable; I will paste all the commands and reuse this file for the poetry lock steps used in the process.
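A minimal sketch of the kernel step (assuming ipykernel as the kernel package; the kernel name cicids-ldap is just an example, pick your own):

poetry add ipykernel
poetry run python -m ipykernel install --user --name cicids-ldap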

We have successfully created the Poetry project and added the requirements to it. To run anything from the command line, we have to activate the environment first with this command:
& "$(poetry env info --path)\Scripts\Activate.ps1"
=========================================================================================

cicids_ldap_realtime_ml/
├── data/
│   ├── raw/
│   │   └── DrDoS_LDAP.parquet       # Original dataset
│   ├── processed/
│   │   └── cleaned_data.csv         # Cleaned, preprocessed CSV
│   └── predictions/
│       └── predictions.csv          # Model predictions (optional log)

├── models/
│   └── ldap_model.pkl               # Trained ML model

├── src/
│   ├── preprocessing.py             # Feature cleaning/engineering
│   ├── train_model.py               # Model training
│   ├── inference.py                 # Loads model, makes predictions
│   ├── stream_simulator.py          # Simulates real-time data from CSV
│   └── influx_writer.py             # Writes predictions to InfluxDB

├── api/
│   └── fastapi_server.py            # Optional FastAPI service for model inference

├── grafana/
│   ├── dashboards/
│   │   └── ldap_dashboard.json      # Prebuilt Grafana dashboard (exported JSON)
│   ├── docker-compose.yml           # Grafana + InfluxDB setup
│   └── datasource_config.yaml       # InfluxDB datasource for Grafana

├── notebooks/
│   └── eda_feature_analysis.ipynb   # Exploratory data analysis

├── scripts/
│   └── run_pipeline.py              # Orchestrates: simulate → predict → write to Influx

├── requirements.txt                 # Python dependencies
├── README.md                        # Project overview and how to run
└── .gitignore                       # Ignore model files, logs, __pycache__, etc.
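Given that layout, a plausible end-to-end run order would be something like the following (hypothetical invocations; the real entry points are the scripts above, so adjust paths and flags as needed):

poetry run python src/train_model.py                  # train and save models/ldap_model.pkl
docker compose -f grafana/docker-compose.yml up -d    # start Grafana + InfluxDB
poetry run python scripts/run_pipeline.py             # simulate → predict → write to Influx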
