A timing and profiling library for Python with support for context managers, decorators, lap timing, and benchmarking.
- Multiple Timer Interfaces: Choose between StopWatch, TimerContext, or the registry-based Timer class
- Context Manager Support: Time code blocks with clean with statements
- Decorator Support: Automatically time function executions with @timer.timed and @timer.timed_async
- Lap Timing: Record intermediate times within a timing context
- Historical Tracking: All timing invocations are stored in lists for statistical analysis
- Benchmarking: Run functions multiple times with warmup support
- Async Support: Full support for async functions with @timer.timed_async
- Zero Dependencies: Uses only the Python standard library
Requirements: Python 3.12+
Install with uv (recommended):
uv pip install -e .

Or with pip:

pip install -e .

Verify the expected import works:

python -c "import timekid; print('timekid OK')"

If you need minimal overhead and are doing a large number of measurements, use FastTimer.
It stores raw integer nanoseconds internally and only converts to seconds when you report.
from timekid.fast import FastTimer
ft = FastTimer()
key = ft.key_id("hot_loop") # do string->id once
for _ in range(1000):
    tok = ft.start(key)
    # ... hot code ...
    ft.stop(tok)
print(ft.times_s(precision=6)[key][:5])

(Planned) A future optional Rust backend could implement the same API for even lower overhead.

Basic context-manager usage with the registry-based Timer:
from timekid.timer import Timer
timer = Timer(precision=3)
with timer['database_query']:
    # Your code here
    result = execute_query()
# Access timing (returns list of floats)
print(f"Query took {timer.times['database_query'][0]}s")from timekid.timer import Timer
timer = Timer(precision=2)
@timer.timed
def process_data(data):
    # Your processing logic
    return processed_data
# Call multiple times
for item in items:
    process_data(item)
# Analyze all invocations
times = timer.times['process_data']
print(f"Average: {sum(times) / len(times):.3f}s")
print(f"Min: {min(times):.3f}s, Max: {max(times):.3f}s")with timer['data_pipeline'] as t:
Lap timing: record intermediate times within a single timing context:

with timer['data_pipeline'] as t:
    load_data()
    t.lap()  # Record lap 1
    transform_data()
    t.lap()  # Record lap 2
    save_data()
# Final lap recorded automatically on exit
# Access lap times
contexts = timer.get('data_pipeline')
print(f"Load: {contexts[0].laps[0]}s")
print(f"Transform: {contexts[0].laps[1]}s")
print(f"Save: {contexts[0].laps[2]}s")import asyncio
from timekid.timer import Timer
timer = Timer()
@timer.timed_async
async def fetch_data(url):
    async with aiohttp.ClientSession() as session:
        async with session.get(url) as response:
            return await response.json()
# Use it like any coroutine; at the top level, drive it with asyncio.run
asyncio.run(fetch_data('https://api.example.com/data'))

Benchmarking: run a function multiple times with warmup support:

from timekid.timer import Timer
timer = Timer()
# Benchmark a function with 1000 iterations and 1 warmup run (not stored in the registry by default);
# num_iter and warmup are passed positionally, followed by my_function's own arguments
results = timer.benchmark(my_function, 1000, 1, arg1, arg2)

# Optionally persist benchmark runs in the timer registry
# (stored under "<func_name> benchmark")
timer.benchmark(my_function, 1000, 1, arg1, arg2, store=True)
print(len(timer.times['my_function benchmark']))

# Optionally provide a custom registry key when storing
custom_key = 'bench.my_function.hot_path'
timer.benchmark(my_function, 1000, 1, arg1, arg2, store=True, key=custom_key)
print(len(timer.times[custom_key]))

# Analyze results
times = [r.elapsed_time for r in results]
avg_time = sum(times) / len(times)
print(f"Average: {avg_time:.6f}s")

For simple manual timing, use StopWatch:

from timekid.timer import StopWatch
sw = StopWatch(precision=2)
sw.start()
# Your code here
do_something()
elapsed = sw.stop()
print(f"Elapsed: {elapsed}s")All timer registry values are stored as lists of TimerContext objects. This enables:
- Historical tracking: Every invocation is preserved
- Statistical analysis: Calculate min, max, average, standard deviation
- Performance trends: Track performance over time
- Consistent API: No mix of single values and lists
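Because the contexts registry keeps the full TimerContext objects, per-run details such as laps and status stay available alongside the raw times; a minimal sketch:

from timekid.timer import Timer

timer = Timer()
with timer['startup']:
    pass

ctx = timer.contexts['startup'][0]             # the TimerContext for the first run
print(ctx.status, ctx.elapsed_time, ctx.laps)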
# Multiple invocations create list entries
with timer['task']:
    do_work()
with timer['task']:
    do_work()
# Access all timings
all_times = timer.times['task'] # Returns list[float]
first_time = timer.times['task'][0]
latest_time = timer.times['task'][-1]

Decorated functions store all invocations under the function name (no numbered keys):
@timer.timed
def process(item):
    return item * 2
# Call 3 times
process(1)
process(2)
process(3)
# All stored under 'process' key
print(len(timer.times['process'])) # Output: 3

All timer types accept an optional precision parameter for rounding:
import time

timer = Timer(precision=3)  # Round to 3 decimal places
with timer['task']:
    time.sleep(0.123456)
print(timer.times['task'][0]) # Output: 0.123

Enable verbose logging to see timing events in real time:
import logging
logger = logging.getLogger(__name__)
timer = Timer(verbose=True, log_func=logger.info)
with timer['task']:
    # Logs start and stop events
    do_work()

Timer: the main registry-based interface for timing operations.
Timer(precision: Optional[int] = None, verbose: bool = False, log_func: Callable[[str], None] = print)

Properties:
- times: Dict[str, list[float]] - All elapsed times for succeeded timers
- contexts: Dict[str, list[TimerContext]] - All timer contexts
- precision: Optional[int] - Configured precision for rounding
Methods:
- timer['key'] - Create/access a timer context (creates a new context each time)
- timed(func) - Decorator for synchronous functions
- timed_async(func) - Decorator for async functions
- get(key: str) - Get all contexts matching a key
- status(key: str) - Get the list of statuses for a key
- sorted(reverse: bool = False) - Get timers sorted by elapsed time
- time_call(func, *args, **kwargs) - Time a single function call (preferred name)
- timeit(func, *args, **kwargs) - Deprecated alias for time_call
- benchmark(func, num_iter: int, warmup: int = 1, *args, store: bool = False, key: Optional[str] = None, **kwargs) - Benchmark a function over multiple iterations (optionally stored in the registry; custom key supported)
- anonymous(name, verbose, log_func) - Create an anonymous timer context (not stored in the registry)
Note:
Timer.timeit(...) has been replaced by Timer.time_call(...) for clarity and to avoid confusion with the stdlib timeit module.
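A minimal sketch of time_call and sorted based on the signatures above; the registry key used by time_call is an assumption (taken to be the function name, as with @timer.timed), and the exact return shape of sorted is not pinned down here:

from timekid.timer import Timer

timer = Timer(precision=4)

# Time a single call; extra positional/keyword arguments are forwarded to the function
timer.time_call(sum, [1, 2, 3])

# Assumed key: the function name, mirroring the decorator behaviour
print(timer.times['sum'])

# Timers sorted by elapsed time, longest first
print(timer.sorted(reverse=True))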
TimerContext: context manager for timing code blocks.
TimerContext(precision: Optional[int], name: Optional[str] = None, verbose: bool = False, log_func: Callable[[str], None] = print)

Properties:
- elapsed_time: float - Total elapsed time
- laps: list[float] - List of lap times
- status: Status - Current status (PENDING/RUNNING/SUCCEEDED/FAILED)
- name: str - Timer name
Methods:
- lap() - Record an intermediate time
- reset() - Reset the timer (clears laps, starts from now)
- rename(name: str) - Change the timer name
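A minimal sketch of driving a TimerContext directly, based on the constructor and methods listed above:

import time
from timekid.timer import TimerContext

ctx = TimerContext(precision=3, name='manual_block')
with ctx:
    time.sleep(0.01)
    ctx.lap()                       # record an intermediate time
    ctx.rename('manual_block_v2')   # change the timer name mid-run
    time.sleep(0.01)

print(ctx.name, ctx.laps, ctx.elapsed_time)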
StopWatch: simple imperative timer with manual control.
StopWatch(precision: Optional[int] = None)

Properties:
- elapsed_time: float - Elapsed time (raises an error if not started)
- status: Status - Current status
Methods:
- start() - Start timing
- stop() - Stop timing and return the elapsed time
- reset() - Reset to the initial state
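A short sketch of reusing a StopWatch via reset(), based on the methods above:

from timekid.timer import StopWatch

sw = StopWatch(precision=4)
sw.start()
first = sw.stop()

sw.reset()        # back to the initial, not-started state
sw.start()
second = sw.stop()
print(first, second)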
Status: timer lifecycle states.
- Status.PENDING - Created but not started
- Status.RUNNING - Currently timing
- Status.STOPPED - Manually stopped (StopWatch only)
- Status.SUCCEEDED - Context exited normally
- Status.FAILED - Context exited with an exception
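A minimal sketch of inspecting these states; the import path for Status (assumed to be timekid.timer, alongside Timer and StopWatch) and the exact printed representations are assumptions:

from timekid.timer import Timer, StopWatch, Status  # Status import path is an assumption

timer = Timer()
try:
    with timer['flaky']:
        raise RuntimeError("boom")
except RuntimeError:
    pass
print(timer.status('flaky'))   # expected to contain Status.FAILED for the raising context

sw = StopWatch()
print(sw.status)               # Status.PENDING before start()
sw.start()
sw.stop()
print(sw.status)               # Status.STOPPED after a manual stop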
Run tests with unittest:
python -m unittest tests._basic_test -v

This project uses uv for package management:
# Install in editable mode
uv pip install -e .
# Run tests
.venv/bin/python -m unittest tests._basic_test -v
# Run examples
python -m timekid.timer

The project includes a GitHub Actions workflow for automated testing:
- Runs on Python 3.12 and 3.13
- Tests on push/PR to main, master, or develop branches
- Uses uv for dependency management
If upgrading from a version with mixed single/list registry values:
# Old way (single values)
elapsed = timer.times['my_task'] # Was a float
# New way (list values)
elapsed = timer.times['my_task'][0] # First timing
elapsed = timer.times['my_task'][-1] # Latest timing

Contributions are welcome! Please ensure:
- Python 3.12+ compatibility
- All tests pass
- Type hints for all functions
- Update documentation as needed
MIT License
Peter Vestereng Larsen (p.vesterenglarsen@gmail.com)