README.md (120 additions & 56 deletions)
# PartiNet 🔬

PartiNet is a three-stage pipeline for automated particle picking in cryo-EM micrographs, combining advanced denoising with deep learning detection and STAR file generation for downstream processing.

## Links

- Documentation: https://mihinp.github.io/partinet_documentation/
- Model weights (Hugging Face): https://huggingface.co/MihinP/PartiNet

## Features

- 🧹 Advanced denoising for improved signal-to-noise ratio
- 🎯 Deep learning-based particle detection
- ⚡ Multi-GPU support for faster processing
- 🔄 Seamless integration with RELION workflows
- 📊 Confidence-based particle filtering
- 🖼️ Visual detection validation

## Prerequisites

Before starting, ensure you have:
- Motion-corrected micrographs
- GPU access (recommended)
- PartiNet installation (see Installation section)

## Installation

```bash
git clone git@github.com:WEHI-ResearchComputing/PartiNet.git
cd PartiNet
```

Then create a virtual environment and install the dependencies:

```bash
python -m venv .venv
source .venv/bin/activate
pip install -U pip
pip install -r requirements.txt
# or editable install for development
pip install -e .
```

Download the model weights from Hugging Face:

```bash
# If you have git-lfs and HTTPS/SSH access
git lfs install
git clone https://huggingface.co/MihinP/PartiNet
# or use the huggingface_hub Python client
pip install huggingface_hub
python - <<'PY'
from huggingface_hub import hf_hub_download
hf_hub_download(repo_id="MihinP/PartiNet", filename="best.pt", repo_type="model")
PY
```

Alternatively, use our containers:

```bash
# Docker
docker run ghcr.io/wehi-researchcomputing/partinet:latest

# Singularity/Apptainer
singularity run oras://ghcr.io/wehi-researchcomputing/partinet:latest
```
## Directory Structure

```
project_directory/
├── motion_corrected/          # 📁 Input micrographs
├── denoised/                  # 🧹 Denoised outputs
├── exp/                       # 🎯 Detection results
│   ├── labels/                # 📋 Coordinates
│   └── ...                    # 🖼️ Visualizations
└── partinet_particles.star    # ⭐ Final output
```
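
PartiNet creates `denoised/`, `exp/`, and the final STAR file for you; only `motion_corrected/` needs to exist up front. A quick input sanity check (a hypothetical helper, not part of PartiNet) might look like:

```python
from pathlib import Path

def count_micrographs(project_dir: str) -> int:
    """Count .mrc files in the motion_corrected/ input directory."""
    src = Path(project_dir) / "motion_corrected"
    if not src.is_dir():
        raise FileNotFoundError(f"missing input directory: {src}")
    return len(list(src.glob("*.mrc")))
```

Running a check like this before `partinet denoise` confirms the inputs are visible, which is especially useful inside containers where `/data` must be bind-mounted.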

## Pipeline Stages

### 1. Denoise
```bash
partinet denoise \
--source /data/my_project/motion_corrected \
--project /data/my_project
```

### 2. Detect
```bash
partinet detect \
  --weight /path/to/model_weights.pt \
  --source /data/my_project/denoised \
  --device 0,1,2,3 \
  --project /data/my_project
```

### 3. Generate STAR File
```bash
partinet star \
--labels /data/my_project/exp/labels \
--images /data/my_project/denoised \
--output /data/my_project/partinet_particles.star \
--conf 0.1
```

## Key Parameters

### Detection
- `--backbone-detector`: Choice of neural network architecture
- `--weight`: Path to model weights
- `--conf-thres`: Detection confidence threshold
- `--iou-thres`: Overlap filtering threshold
- `--device`: GPU device selection
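
PartiNet's exact post-processing isn't shown here, but `--conf-thres` and `--iou-thres` follow the usual detector convention: drop boxes below the confidence threshold, then suppress overlapping boxes whose intersection-over-union exceeds the IoU threshold. An illustrative sketch (not PartiNet's implementation):

```python
def iou(a, b):
    """Intersection-over-union of two boxes given as (x1, y1, x2, y2)."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    return inter / (area_a + area_b - inter)

def filter_detections(boxes, conf_thres=0.25, iou_thres=0.45):
    """Keep confident boxes, greedily suppressing heavy overlaps (NMS)."""
    boxes = sorted((b for b in boxes if b["conf"] >= conf_thres),
                   key=lambda b: b["conf"], reverse=True)
    kept = []
    for b in boxes:
        if all(iou(b["xyxy"], k["xyxy"]) <= iou_thres for k in kept):
            kept.append(b)
    return kept
```

Lowering `--conf-thres` keeps more marginal particles (they can still be filtered later at the STAR stage with `--conf`); lowering `--iou-thres` suppresses overlapping picks more aggressively.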

### STAR Generation
- `--conf`: Confidence threshold for particle filtering
- `--output`: Path for final STAR file

## Output Files

1. **Denoised Micrographs** (`denoised/*.mrc`)
- Cleaned micrographs with improved SNR

2. **Detection Results** (`exp/`)
- `labels/*.txt`: Particle coordinates
- `*.png`: Visualization overlays

3. **STAR File** (`partinet_particles.star`)
- Ready for RELION processing
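
The label format isn't specified in this README; if, as with YOLO-style detectors, each line of a label file is `class x_center y_center width height confidence` in normalized units (an assumption worth verifying against your own `exp/labels/` output), filtering rows by confidence could be sketched as:

```python
def confident_particles(label_text, conf=0.1):
    """Parse assumed YOLO-style label lines (class cx cy w h conf, with
    coordinates normalized to [0, 1]) and keep rows at or above `conf`."""
    rows = []
    for line in label_text.splitlines():
        parts = line.split()
        if len(parts) != 6:
            continue  # skip malformed or differently formatted lines
        _cls, cx, cy, w, h, c = parts
        if float(c) >= conf:
            rows.append((float(cx), float(cy), float(c)))
    return rows
```

This mirrors what the `star` stage's `--conf 0.1` threshold does conceptually: low-confidence picks are discarded before the STAR file is written.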

## Containerized Usage

```bash
# Docker
docker run --gpus all -v /data:/data ghcr.io/wehi-researchcomputing/partinet:main \
    partinet detect --weight /path/to/best.pt --source /data/my_project/denoised --project /data/my_project

# Apptainer / Singularity
apptainer exec --nv --no-home -B /data oras://ghcr.io/wehi-researchcomputing/partinet:main-singularity \
    partinet detect --weight /path/to/best.pt --source /data/my_project/denoised --project /data/my_project
```

## Advanced Usage

For detailed information about specific commands:

```bash
partinet --help
partinet <command> --help
```

Available commands:
- `denoise`: Clean input micrographs
- `detect`: Identify particles
- `star`: Generate STAR files
- `train`: Train custom models (step1/step2)
- `test`: Evaluate model performance

## Troubleshooting

- **GPU Issues**
- Verify GPU availability: `nvidia-smi`
- Check CUDA installation
- Ensure proper device selection

- **Path Issues**
- Verify directory permissions
- Check mount points in container setups
- Ensure absolute paths are used

## Contributing

We welcome contributions! Please see our [Contributing Guidelines](CONTRIBUTING.md) for details. Tests and CI pipelines live in `.github/workflows/`.

## License

This project is licensed under the terms of the [LICENSE](LICENSE) file included in the repository.

## Citation

If you use PartiNet in your research, please cite:
```
Citation information will be added upon publication
```

## Support

For issues and questions:
- Open an [Issue](https://github.com/WEHI-ResearchComputing/PartiNet/issues)
- Check existing [Discussions](https://github.com/WEHI-ResearchComputing/PartiNet/discussions)
docs/.gitignore (20 additions & 0 deletions)
# Dependencies
/node_modules

# Production
/build

# Generated files
.docusaurus
.cache-loader

# Misc
.DS_Store
.env.local
.env.development.local
.env.test.local
.env.production.local

npm-debug.log*
yarn-debug.log*
yarn-error.log*
docs/docs/getting-started.md (135 additions & 0 deletions)
---
sidebar_position: 3
---

# Getting Started

This guide walks you through your first PartiNet analysis using the three-stage pipeline. We'll process cryo-EM micrographs from start to finish.

## Prerequisites

Before starting, ensure you have:
- PartiNet installed (see [Installation](installation.md))
- Motion-corrected micrographs in a source directory
- A project directory where outputs will be saved
- GPU access for optimal performance

## Directory Structure

PartiNet expects and creates the following directory structure:

```
project_directory/
├── motion_corrected/            # 📁 Your input micrographs
│   ├── micrograph1.mrc
│   ├── micrograph2.mrc
│   └── ...
├── denoised/                    # 🧹 Created by denoise stage
│   ├── micrograph1.mrc
│   ├── micrograph2.mrc
│   └── ...
├── exp/                         # 🎯 Created by detect stage
│   ├── labels/                  # 📋 Detection coordinates
│   │   ├── micrograph1.txt
│   │   ├── micrograph2.txt
│   │   └── ...
│   ├── micrograph1.png          # 🖼️ Micrographs with detections drawn
│   ├── micrograph2.png
│   └── ...
└── partinet_particles.star      # ⭐ Final STAR file (created by star stage)
```

**Pipeline Flow:**
1. **Input** → `motion_corrected/` (your micrographs)
2. **Stage 1** → `denoised/` (cleaned micrographs)
3. **Stage 2** → `exp/` (detections + visualizations)
4. **Stage 3** → `partinet_particles.star` (final particle coordinates)

## Stage 1: Denoise

The first stage removes noise from your micrographs and improves signal-to-noise ratios:

<div class="container-tabs">

```shell title="Local Installation"
partinet denoise \
--source /data/my_project/motion_corrected \
--project /data/my_project
```

</div>

**What this does:**
- Reads micrographs from `motion_corrected/` directory
- Applies denoising algorithms
- Saves cleaned micrographs to `denoised/` directory in your project folder

## Stage 2: Detect

The detection stage identifies particles in your denoised micrographs:

<div class="container-tabs">

```shell title="Local Installation"
partinet detect \
  --weight /path/to/downloaded/model_weights.pt \
  --source /data/my_project/denoised \
  --device 0,1,2,3 \
  --project /data/my_project
```

</div>

**What this creates:**
- `exp/` directory in your project folder
- `exp/labels/` directory containing detection coordinates for each micrograph
- Micrographs with detection boxes drawn on top (saved in `exp/`)

**Key parameters:**
- `--backbone-detector`: Neural network architecture to use
- `--weight`: Path to trained model weights
- `--conf-thres`: Confidence threshold for detections (0.0 = accept all)
- `--iou-thres`: Intersection over Union threshold for filtering overlapping detections
- `--device`: GPU devices to use (0,1,2,3 = use 4 GPUs)
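
The coordinate files in `exp/labels/` are per-micrograph text files. Assuming detector-style normalized coordinates (an assumption; check your own label files), mapping a box center back to pixel positions on the micrograph is just a rescale:

```python
def to_pixels(cx, cy, width, height):
    """Map an assumed normalized center (cx, cy) in [0, 1] to pixel
    coordinates on a micrograph of the given width x height."""
    if not (0.0 <= cx <= 1.0 and 0.0 <= cy <= 1.0):
        raise ValueError("expected normalized coordinates in [0, 1]")
    return cx * width, cy * height
```

For example, a center of (0.5, 0.25) on a 4096 x 4096 micrograph corresponds to pixel position (2048, 1024).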

## Stage 3: Star

The final stage converts detections to STAR format and applies confidence filtering:

<div class="container-tabs">

```shell title="Local Installation"
partinet star \
--labels /data/my_project/exp/labels \
--images /data/my_project/denoised \
--output /data/my_project/partinet_particles.star \
--conf 0.1
```

</div>

**What this does:**
- Reads detection labels from `exp/labels/`
- Filters particles based on confidence threshold (0.1 in this example)
- Creates a STAR file ready for further processing in RELION or other software
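
The exact columns PartiNet writes aren't listed here, but a minimal RELION-readable particle STAR file pairs pixel coordinates with the source micrograph. A hedged sketch of that format (column names follow RELION conventions; PartiNet's actual output may include additional fields):

```python
def write_star(particles):
    """Render a minimal particle STAR file from a list of
    (micrograph_name, x, y) tuples."""
    lines = [
        "data_particles",
        "",
        "loop_",
        "_rlnMicrographName #1",
        "_rlnCoordinateX #2",
        "_rlnCoordinateY #3",
    ]
    for name, x, y in particles:
        lines.append(f"{name} {x:.1f} {y:.1f}")
    return "\n".join(lines) + "\n"
```

Each data row ties one picked particle back to its micrograph, which is what lets RELION extract particle boxes in the next step of the workflow.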

## Output Files

After running all three stages, you'll have:

1. **Denoised micrographs** (`denoised/`) - Cleaned input for particle detection
2. **Detection visualizations** (`exp/*.png`) - Micrographs with particle boxes drawn
3. **Detection coordinates** (`exp/labels/*.txt`) - Raw detection data
4. **STAR file** (`*.star`) - Final particle coordinates ready for downstream processing


## Next Steps

- Learn more about individual stages: [Denoise](stages/denoise.md), [Detect](stages/detect.md), [STAR](stages/star.md)

## Troubleshooting

If you encounter issues:
- Ensure all paths exist and are accessible
- Check GPU availability with `nvidia-smi`
- Verify that container bind mounts (`-B` flags) include all necessary paths