notes and insights
$env:Path += ";C:\Users\vashi\AppData\Roaming\Python\Scripts"
this is the command that adds Poetry's Scripts directory to PATH so the `poetry` command is available in your PowerShell session
=========================================================================================
we have set up the Jupyter kernel using Poetry. In the venv you have to install the
dependencies and register the kernel for Poetry to be used. I will paste all the
commands and reuse this file for the poetry.lock workflow.
we have successfully created the Poetry project and its requirements. To run anything
inside the environment, activate it with this command:
& "$(poetry env info --path)\Scripts\Activate.ps1"
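For reference, the usual sequence before activation would be something like the following (a sketch, assuming `ipykernel` is listed as a dependency in pyproject.toml; the kernel name `cicids-ldap` is just an example, not from this project):

```shell
# Install all dependencies pinned in pyproject.toml / poetry.lock into the venv
poetry install

# Register the venv as a Jupyter kernel (kernel name here is an example)
poetry run python -m ipykernel install --user --name cicids-ldap
```

After this, the registered kernel should show up in Jupyter's kernel picker.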
=========================================================================================
cicids_ldap_realtime_ml/
├── data/
│ ├── raw/
│ │ └── DrDoS_LDAP.parquet # Original dataset
│ ├── processed/
│ │ └── cleaned_data.csv # Cleaned, preprocessed CSV
│ └── predictions/
│ └── predictions.csv # Model predictions (optional log)
│
├── models/
│ └── ldap_model.pkl # Trained ML model
│
├── src/
│ ├── preprocessing.py # Feature cleaning/engineering
│ ├── train_model.py # Model training
│ ├── inference.py # Loads model, makes predictions
│ ├── stream_simulator.py # Simulates real-time data from CSV
│ └── influx_writer.py # Writes predictions to InfluxDB
│
├── api/
│ └── fastapi_server.py # Optional FastAPI service for model inference
│
├── grafana/
│ ├── dashboards/
│ │ └── ldap_dashboard.json # Prebuilt Grafana dashboard (exported JSON)
│ ├── docker-compose.yml # Grafana + InfluxDB setup
│ └── datasource_config.yaml # InfluxDB datasource for Grafana
│
├── notebooks/
│ └── eda_feature_analysis.ipynb # Exploratory data analysis
│
├── scripts/
│ └── run_pipeline.py # Orchestrates: simulate → predict → write to Influx
│
├── requirements.txt # Python dependencies
├── README.md # Project overview and how to run
└── .gitignore # Ignore model files, logs, __pycache__, etc.
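To tie the layout above to run commands, a minimal launch sketch (the paths come from the tree above; the `docker compose` syntax assumes a recent Docker CLI with the Compose plugin):

```shell
# Start Grafana + InfluxDB as defined in the project's compose file
docker compose -f grafana/docker-compose.yml up -d

# Run the full pipeline inside the Poetry venv: simulate → predict → write to Influx
poetry run python scripts/run_pipeline.py
```

The dashboard can then be imported into Grafana from grafana/dashboards/ldap_dashboard.json.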