timeline-forensics

Timeline Forensics

Comprehensive timeline forensics skill for creating and analyzing forensic timelines from multiple data sources. Enables super timeline creation, event correlation, anomaly detection, and visualization of activities across disk, memory, network, and log sources.

Capabilities

  • Super Timeline Creation: Create comprehensive timelines from multiple sources
  • Multi-Source Correlation: Correlate events across different artifact types
  • Event Filtering: Filter timelines by time, source, or keyword
  • Anomaly Detection: Identify unusual patterns and outliers
  • Timeline Visualization: Create interactive timeline visualizations
  • Gap Analysis: Identify missing time periods in evidence
  • Pivot Point Analysis: Find key events and pivot around them
  • Export Formats: Export to CSV, JSON, bodyfile, and other formats
  • Timeline Comparison: Compare timelines from different systems
  • Activity Clustering: Group related events into activities
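
The bodyfile export mentioned above refers to the pipe-delimited mactime input format used by The Sleuth Kit. As a rough illustration of what that format carries (a generic sketch, not this skill's implementation), a minimal parser for one bodyfile line might look like:

```python
# Parse one line of a Sleuth Kit 3.x bodyfile (mactime input format).
# Fields: MD5|name|inode|mode|UID|GID|size|atime|mtime|ctime|crtime
# Timestamps are Unix epoch seconds; 0 or -1 means "not set".
from datetime import datetime, timezone

def parse_bodyfile_line(line: str) -> dict:
    fields = line.rstrip("\n").split("|")
    md5, name, inode, mode, uid, gid, size = fields[:7]
    atime, mtime, ctime, crtime = (int(t) for t in fields[7:11])

    def to_utc(ts: int):
        # Render set timestamps as ISO-8601 UTC; leave unset ones as None.
        if ts <= 0:
            return None
        return datetime.fromtimestamp(ts, tz=timezone.utc).isoformat()

    return {
        "md5": md5, "name": name, "inode": inode, "mode": mode,
        "size": int(size),
        "atime": to_utc(atime), "mtime": to_utc(mtime),
        "ctime": to_utc(ctime), "crtime": to_utc(crtime),
    }

row = parse_bodyfile_line(
    "0|/Users/suspect/malware.exe|12345|r/rrwxrwxrwx|0|0|4096|"
    "1705314600|1705314600|1705314600|1705314600"
)
print(row["mtime"])  # 2024-01-15T10:30:00+00:00
```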

Quick Start

```python
from timeline_forensics import TimelineBuilder, SuperTimeline, TimelineAnalyzer

# Create super timeline
builder = TimelineBuilder()
builder.add_disk_image("/evidence/disk.E01")
builder.add_memory_dump("/evidence/memory.raw")
builder.add_logs("/evidence/logs/")
timeline = builder.build()

# Analyze timeline
analyzer = TimelineAnalyzer(timeline)
anomalies = analyzer.detect_anomalies()
```

Usage

Task 1: Super Timeline Creation

Input: Multiple forensic artifacts
Process:
  1. Add all evidence sources
  2. Parse timestamps from each source
  3. Normalize to UTC
  4. Merge into unified timeline
  5. Generate output
Output: Comprehensive super timeline
Example:

```python
from timeline_forensics import TimelineBuilder

# Initialize timeline builder
builder = TimelineBuilder(
    case_id="CASE-2024-001",
    timezone="UTC"
)

# Add disk image (will parse MFT, registry, etc.)
builder.add_disk_image(
    image_path="/evidence/disk.E01",
    parsers=["mft", "registry", "prefetch", "evtx", "browser"]
)

# Add memory dump
builder.add_memory_dump("/evidence/memory.raw")

# Add log files
builder.add_logs("/evidence/logs/")

# Add PCAP
builder.add_pcap("/evidence/capture.pcap")

# Add custom events
builder.add_custom_event(
    timestamp="2024-01-15T10:30:00Z",
    source="analyst",
    description="Incident reported by user",
    event_type="incident_report"
)

# Build timeline
timeline = builder.build()
print(f"Total events: {timeline.event_count}")
print(f"Time range: {timeline.start_time} - {timeline.end_time}")
print(f"Sources: {timeline.sources}")

# Export timeline
timeline.export_csv("/evidence/timeline/supertimeline.csv")
timeline.export_json("/evidence/timeline/supertimeline.json")
timeline.export_bodyfile("/evidence/timeline/bodyfile.txt")

# Generate timeline report
builder.generate_report("/evidence/timeline/timeline_report.html")
```
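
Step 4 of the process above, merging per-source event streams into one unified timeline, is essentially a k-way merge of already-sorted streams. A minimal sketch of that idea, independent of this skill's internals (the tuple layout is an assumption for illustration):

```python
# K-way merge of per-source event lists that are each sorted by timestamp.
# Events here are (iso_utc_timestamp, source, description); ISO-8601 UTC
# strings sort correctly as plain strings, so no datetime parsing is needed.
import heapq

mft_events = [
    ("2024-01-15T10:30:00Z", "mft", "malware.exe created"),
    ("2024-01-15T11:05:00Z", "mft", "malware.exe modified"),
]
log_events = [
    ("2024-01-15T10:31:12Z", "evtx", "4688 process creation"),
]
pcap_events = [
    ("2024-01-15T10:32:40Z", "pcap", "TCP 443 to 203.0.113.50"),
]

# heapq.merge is lazy, so this scales to streams far larger than memory.
super_timeline = list(heapq.merge(mft_events, log_events, pcap_events))
for ts, source, desc in super_timeline:
    print(f"[{ts}] {source}: {desc}")
```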

Task 2: File System Timeline

Input: Disk image or file system
Process:
  1. Parse MFT/inode tables
  2. Extract all timestamps
  3. Handle MAC times
  4. Detect timestomping
  5. Build file timeline
Output: File system timeline
Example:

```python
from timeline_forensics import FileSystemTimeline

# Initialize file system timeline
fst = FileSystemTimeline("/evidence/disk.E01")

# Parse file system
fst.parse()

# Get all events
events = fst.get_events()
for event in events[:10]:
    print(f"[{event.timestamp}] {event.event_type}")
    print(f"  File: {event.filename}")
    print(f"  Path: {event.full_path}")
    print(f"  Source: {event.timestamp_source}")  # mtime, atime, ctime, crtime

# Get events for specific file
file_events = fst.get_file_events("/Users/suspect/malware.exe")
for event in file_events:
    print(f"[{event.timestamp}] {event.event_type}")
    print(f"  Timestamp type: {event.timestamp_source}")

# Detect timestomping
anomalies = fst.detect_timestamp_anomalies()
for a in anomalies:
    print(f"ANOMALY: {a.file_path}")
    print(f"  Type: {a.anomaly_type}")
    print(f"  Evidence: {a.evidence}")

# Get recently modified files
recent = fst.get_files_modified_after("2024-01-15T00:00:00Z")

# Get files created during incident window
incident_files = fst.get_files_in_range(
    start="2024-01-15T10:00:00Z",
    end="2024-01-15T12:00:00Z",
    event_types=["created", "modified"]
)

# Export file system timeline
fst.export("/evidence/timeline/filesystem.csv")
```
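
One common timestomping heuristic (which `detect_timestamp_anomalies()` may or may not use; the check below is a generic sketch, not this skill's code) compares NTFS `$STANDARD_INFORMATION` timestamps against `$FILE_NAME` timestamps, since many stomping tools rewrite only the former:

```python
# Flag files whose $STANDARD_INFORMATION creation time predates the
# $FILE_NAME creation time. $FN times are set by the kernel at creation
# and are rarely touched by user-mode tools, so $SI earlier than $FN is
# a classic timestomping indicator.
from datetime import datetime

def looks_timestomped(si_crtime: str, fn_crtime: str) -> bool:
    fmt = "%Y-%m-%dT%H:%M:%SZ"
    si = datetime.strptime(si_crtime, fmt)
    fn = datetime.strptime(fn_crtime, fmt)
    return si < fn

# $SI claims 2010, but $FN (set at actual creation) says 2024: suspicious.
print(looks_timestomped("2010-06-01T00:00:00Z", "2024-01-15T10:30:00Z"))  # True
print(looks_timestomped("2024-01-15T10:30:00Z", "2024-01-15T10:30:00Z"))  # False
```

Another frequent tell, not shown here, is a `$SI` timestamp with zeroed sub-second precision, which some stomping utilities leave behind.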

Task 3: Registry Timeline

Input: Registry hives
Process:
  1. Parse registry key timestamps
  2. Extract last-write times
  3. Build key timeline
  4. Identify rapid changes
  5. Correlate with events
Output: Registry timeline
Example:

```python
from timeline_forensics import RegistryTimeline

# Initialize registry timeline
rt = RegistryTimeline()

# Add registry hives
rt.add_hive("/evidence/registry/SYSTEM")
rt.add_hive("/evidence/registry/SOFTWARE")
rt.add_hive("/evidence/registry/NTUSER.DAT")

# Build timeline
rt.build()

# Get all events
events = rt.get_events()
for event in events[:10]:
    print(f"[{event.timestamp}] Registry modification")
    print(f"  Hive: {event.hive}")
    print(f"  Key: {event.key_path}")

# Get events for specific key (raw string keeps the backslashes literal)
run_events = rt.get_key_events(r"Software\Microsoft\Windows\CurrentVersion\Run")

# Find rapid modifications (potential automation)
rapid = rt.find_rapid_modifications(
    threshold_seconds=60,
    min_changes=10
)
for r in rapid:
    print(f"Rapid changes at {r.start_time}:")
    print(f"  Keys modified: {r.key_count}")
    print(f"  Duration: {r.duration_seconds}s")

# Get modifications in time range
incident_mods = rt.get_modifications_in_range(
    start="2024-01-15T10:00:00Z",
    end="2024-01-15T12:00:00Z"
)

# Export registry timeline
rt.export("/evidence/timeline/registry.csv")
```

Task 4: Event Log Timeline

Input: Windows Event Logs
Process:
  1. Parse EVTX files
  2. Extract timestamps
  3. Categorize events
  4. Build log timeline
  5. Identify patterns
Output: Event log timeline
Example:

```python
from timeline_forensics import EventLogTimeline

# Initialize event log timeline
elt = EventLogTimeline()

# Add event logs
elt.add_log("/evidence/logs/Security.evtx")
elt.add_log("/evidence/logs/System.evtx")
elt.add_log("/evidence/logs/Application.evtx")
elt.add_directory("/evidence/logs/")

# Build timeline
elt.build()

# Get all events
events = elt.get_events()
for event in events[:10]:
    print(f"[{event.timestamp}] {event.log_name}")
    print(f"  Event ID: {event.event_id}")
    print(f"  Description: {event.description}")

# Get security events
security_events = elt.get_events_by_log("Security")

# Get specific event IDs
login_events = elt.get_events_by_id([4624, 4625])
for event in login_events:
    print(f"[{event.timestamp}] Login event {event.event_id}")
    print(f"  User: {event.user}")
    print(f"  Source IP: {event.source_ip}")

# Find event sequences
sequences = elt.find_event_sequences([
    {"event_id": 4624, "description": "Login"},
    {"event_id": 4688, "description": "Process creation"},
    {"event_id": 4689, "description": "Process exit"}
])

# Export event log timeline
elt.export("/evidence/timeline/eventlogs.csv")
```
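
Sequence detection like `find_event_sequences()` above boils down to matching an ordered pattern of event IDs against a time-sorted event list. A standalone sketch of that idea (independent of this skill's internals, which are not documented here):

```python
# Find occurrences of an ordered event-ID pattern (login -> process
# creation -> process exit) in a time-sorted event list. Unrelated events
# between pattern steps are skipped rather than breaking the match.
def find_sequences(events, pattern):
    """events: list of (timestamp, event_id); pattern: list of event IDs."""
    matches, current, idx = [], [], 0
    for ts, eid in events:
        if eid == pattern[idx]:
            current.append((ts, eid))
            idx += 1
            if idx == len(pattern):      # full pattern matched
                matches.append(current)
                current, idx = [], 0
    return matches

events = [
    ("10:00", 4624), ("10:01", 4688), ("10:02", 7045), ("10:03", 4689),
    ("10:10", 4624), ("10:11", 4688), ("10:12", 4689),
]
seqs = find_sequences(events, [4624, 4688, 4689])
print(len(seqs))  # 2
```

A production matcher would also bound the time between steps; without that, widely separated events can still be joined into one "sequence".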

Task 5: Network Timeline

Input: Network captures
Process:
  1. Parse PCAP files
  2. Extract connection timestamps
  3. Track sessions
  4. Build network timeline
  5. Correlate with activity
Output: Network activity timeline
Example:

```python
from timeline_forensics import NetworkTimeline

# Initialize network timeline
nt = NetworkTimeline()

# Add network captures
nt.add_pcap("/evidence/network/capture1.pcap")
nt.add_pcap("/evidence/network/capture2.pcap")

# Add flow data
nt.add_netflow("/evidence/network/flows/")

# Build timeline
nt.build()

# Get all events
events = nt.get_events()
for event in events[:10]:
    print(f"[{event.timestamp}] {event.event_type}")
    print(f"  Source: {event.src_ip}:{event.src_port}")
    print(f"  Destination: {event.dst_ip}:{event.dst_port}")
    print(f"  Protocol: {event.protocol}")

# Get connections to specific IP
c2_connections = nt.get_connections_to_ip("203.0.113.50")

# Get DNS queries
dns_events = nt.get_dns_events()
for event in dns_events:
    print(f"[{event.timestamp}] DNS: {event.query}")

# Get HTTP events
http_events = nt.get_http_events()
for event in http_events:
    print(f"[{event.timestamp}] HTTP: {event.method} {event.url}")

# Find data transfers
transfers = nt.find_large_transfers(min_bytes=1000000)

# Export network timeline
nt.export("/evidence/timeline/network.csv")
```

Task 6: Timeline Correlation

Input: Multiple timelines or super timeline
Process:
  1. Align timestamps
  2. Find temporal correlations
  3. Identify related events
  4. Build event chains
  5. Document relationships
Output: Correlated timeline analysis
Example:

```python
from timeline_forensics import TimelineCorrelator

# Initialize correlator with super timeline
correlator = TimelineCorrelator("/evidence/timeline/supertimeline.csv")

# Find events around pivot point
pivot = correlator.get_events_around(
    timestamp="2024-01-15T10:30:00Z",
    window_minutes=30
)
for event in pivot:
    print(f"[{event.timestamp}] {event.source}: {event.description}")

# Correlate by IP address
ip_activity = correlator.correlate_by_ip("192.168.1.100")
print(f"Events related to IP: {len(ip_activity)}")

# Correlate by filename
file_activity = correlator.correlate_by_filename("malware.exe")
print(f"Events related to file: {len(file_activity)}")

# Correlate by user (raw string keeps the backslash literal)
user_activity = correlator.correlate_by_user(r"DOMAIN\suspect")

# Find event chains
chains = correlator.find_event_chains()
for chain in chains:
    print(f"Chain: {chain.name}")
    print(f"  Events: {len(chain.events)}")
    print(f"  Duration: {chain.duration}")
    for event in chain.events:
        print(f"    [{event.timestamp}] {event.description}")

# Detect temporal anomalies
anomalies = correlator.detect_temporal_anomalies()
for a in anomalies:
    print(f"ANOMALY: {a.description}")
    print(f"  Events: {a.events}")

# Generate correlation report
correlator.generate_report("/evidence/timeline/correlation.html")
```
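
The pivot-point query above reduces to filtering a sorted timeline to a window of plus or minus N minutes around a timestamp. A standalone sketch of the same idea, using only the standard library:

```python
# Select events within +/- window_minutes of a pivot timestamp.
from datetime import datetime, timedelta

def events_around(events, pivot_iso, window_minutes):
    """events: list of (iso_utc_timestamp, description) tuples."""
    fmt = "%Y-%m-%dT%H:%M:%SZ"
    pivot = datetime.strptime(pivot_iso, fmt)
    delta = timedelta(minutes=window_minutes)
    return [
        (ts, desc) for ts, desc in events
        if abs(datetime.strptime(ts, fmt) - pivot) <= delta
    ]

events = [
    ("2024-01-15T09:45:00Z", "user login"),
    ("2024-01-15T10:25:00Z", "malware.exe created"),
    ("2024-01-15T10:31:00Z", "outbound C2 connection"),
    ("2024-01-15T13:00:00Z", "log cleared"),
]
hits = events_around(events, "2024-01-15T10:30:00Z", window_minutes=30)
print(len(hits))  # 2
```

On a sorted timeline this could use binary search (`bisect`) instead of a linear scan, which matters once timelines reach millions of events.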

Task 7: Timeline Filtering

Input: Timeline data
Process:
  1. Apply time filters
  2. Apply source filters
  3. Apply keyword filters
  4. Reduce noise
  5. Focus investigation
Output: Filtered timeline
Example:

```python
from timeline_forensics import TimelineFilter

# Initialize filter with timeline ("tl_filter" avoids shadowing the
# built-in filter())
tl_filter = TimelineFilter("/evidence/timeline/supertimeline.csv")

# Filter by time range
time_filtered = tl_filter.by_time_range(
    start="2024-01-15T10:00:00Z",
    end="2024-01-15T12:00:00Z"
)
print(f"Events in time range: {len(time_filtered)}")

# Filter by source
source_filtered = tl_filter.by_source(["MFT", "Registry", "EventLog"])

# Filter by keyword
keyword_filtered = tl_filter.by_keyword(
    keywords=["malware", "suspicious", "admin"],
    case_sensitive=False
)

# Filter by event type
type_filtered = tl_filter.by_event_type(["file_created", "process_start"])

# Exclude noise (raw string keeps the backslashes literal)
noise_excluded = tl_filter.exclude_patterns([
    r"Windows\Prefetch\.pf",
    "$RECYCLE.BIN",
    "pagefile.sys"
])

# Complex filter
complex_filtered = tl_filter.complex_filter(
    time_start="2024-01-15T10:00:00Z",
    time_end="2024-01-15T12:00:00Z",
    sources=["MFT", "Registry"],
    keywords=["malware"],
    exclude_patterns=["TEMP"]
)

# Export filtered timeline
tl_filter.export_filtered("/evidence/timeline/filtered.csv", complex_filtered)
```

Task 8: Timeline Visualization

Input: Timeline data
Process:
  1. Prepare visualization data
  2. Create interactive charts
  3. Generate heat maps
  4. Build activity graphs
  5. Export visualizations
Output: Timeline visualizations
Example:

```python
from timeline_forensics import TimelineVisualizer

# Initialize visualizer
viz = TimelineVisualizer("/evidence/timeline/supertimeline.csv")

# Create interactive timeline
viz.create_interactive_timeline(
    output_path="/evidence/timeline/interactive.html",
    title="Incident Timeline",
    highlight_events=["malware.exe", "suspicious"]
)

# Create activity heatmap
viz.create_heatmap(
    output_path="/evidence/timeline/heatmap.html",
    granularity="hour"
)

# Create source distribution chart
viz.create_source_chart(
    output_path="/evidence/timeline/sources.html"
)

# Create event type distribution
viz.create_event_type_chart(
    output_path="/evidence/timeline/event_types.html"
)

# Create activity sparkline
viz.create_activity_sparkline(
    output_path="/evidence/timeline/activity.png",
    window="day"
)

# Create network graph
viz.create_event_graph(
    output_path="/evidence/timeline/event_graph.html",
    relationship_type="temporal"
)

# Generate full visualization report
viz.generate_visualization_report(
    output_dir="/evidence/timeline/viz/",
    include_all=True
)
```

Task 9: Gap Analysis

Input: Timeline data
Process:
  1. Analyze event distribution
  2. Identify time gaps
  3. Detect missing periods
  4. Assess evidence coverage
  5. Document gaps
Output: Gap analysis report
Example:

```python
from timeline_forensics import GapAnalyzer

# Initialize gap analyzer
analyzer = GapAnalyzer("/evidence/timeline/supertimeline.csv")

# Find gaps in timeline
gaps = analyzer.find_gaps(min_gap_minutes=60)
for gap in gaps:
    print(f"GAP: {gap.start_time} - {gap.end_time}")
    print(f"  Duration: {gap.duration_minutes} minutes")
    print(f"  Events before: {gap.events_before}")
    print(f"  Events after: {gap.events_after}")

# Analyze coverage by source
coverage = analyzer.analyze_source_coverage()
for source, cov in coverage.items():
    print(f"Source: {source}")
    print(f"  First event: {cov.first_event}")
    print(f"  Last event: {cov.last_event}")
    print(f"  Coverage: {cov.coverage_percent}%")
    print(f"  Gaps: {cov.gap_count}")

# Find suspicious gaps
suspicious = analyzer.find_suspicious_gaps()
for gap in suspicious:
    print(f"SUSPICIOUS GAP: {gap.start_time} - {gap.end_time}")
    print(f"  Reason: {gap.reason}")

# Analyze activity distribution
distribution = analyzer.analyze_distribution()
print(f"Peak hours: {distribution.peak_hours}")
print(f"Quiet hours: {distribution.quiet_hours}")
print(f"Average events/hour: {distribution.avg_events_per_hour}")

# Generate gap report
analyzer.generate_report("/evidence/timeline/gap_analysis.html")
```
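
Gap detection itself is straightforward once the timeline is sorted: walk consecutive timestamps and report any delta above a threshold. A generic sketch of the idea (it may or may not mirror what `find_gaps()` does internally):

```python
# Report gaps of at least min_gap_minutes between consecutive events.
from datetime import datetime

def find_gaps(timestamps, min_gap_minutes):
    fmt = "%Y-%m-%dT%H:%M:%SZ"
    ts = sorted(datetime.strptime(t, fmt) for t in timestamps)
    gaps = []
    for prev, cur in zip(ts, ts[1:]):          # consecutive pairs
        minutes = (cur - prev).total_seconds() / 60
        if minutes >= min_gap_minutes:
            gaps.append((prev.isoformat(), cur.isoformat(), minutes))
    return gaps

gaps = find_gaps(
    ["2024-01-15T10:00:00Z", "2024-01-15T10:05:00Z", "2024-01-15T12:30:00Z"],
    min_gap_minutes=60,
)
print(gaps)  # one 145-minute gap between 10:05 and 12:30
```

Note the limitation stated below: a quiet period is only "missing evidence" if the source normally produces continuous activity, so gaps need interpretation, not just detection.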

Task 10: Timeline Analysis

Input: Timeline data
Process:
  1. Statistical analysis
  2. Pattern detection
  3. Anomaly identification
  4. Activity clustering
  5. Investigation support
Output: Timeline analysis results
Example:

```python
from timeline_forensics import TimelineAnalyzer

# Initialize analyzer
analyzer = TimelineAnalyzer("/evidence/timeline/supertimeline.csv")

# Get timeline statistics
stats = analyzer.get_statistics()
print(f"Total events: {stats.total_events}")
print(f"Time span: {stats.time_span}")
print(f"Sources: {stats.source_count}")
print(f"Event types: {stats.event_type_count}")
print(f"Unique files: {stats.unique_files}")

# Detect anomalies
anomalies = analyzer.detect_anomalies()
for a in anomalies:
    print(f"ANOMALY: {a.type}")
    print(f"  Description: {a.description}")
    print(f"  Timestamp: {a.timestamp}")
    print(f"  Confidence: {a.confidence}")

# Find patterns
patterns = analyzer.find_patterns()
for p in patterns:
    print(f"Pattern: {p.name}")
    print(f"  Occurrences: {p.count}")
    print(f"  Description: {p.description}")

# Cluster related events
clusters = analyzer.cluster_events()
for cluster in clusters:
    print(f"Cluster: {cluster.label}")
    print(f"  Events: {cluster.event_count}")
    print(f"  Time range: {cluster.start_time} - {cluster.end_time}")

# Get investigation suggestions
suggestions = analyzer.get_investigation_suggestions()
for s in suggestions:
    print(f"SUGGESTION: {s.title}")
    print(f"  Priority: {s.priority}")
    print(f"  Description: {s.description}")
    print(f"  Related events: {s.event_count}")

# Generate analysis report
analyzer.generate_report("/evidence/timeline/analysis.html")
```

Configuration

Environment Variables

| Variable | Description | Required | Default |
|---|---|---|---|
| PLASO_PATH | Path to Plaso tools | No | System PATH |
| TIMELINE_TZ | Default timezone | No | UTC |
| MAX_EVENTS | Maximum events to process | No | 10000000 |
| CACHE_DIR | Timeline cache directory | No | ./cache |
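
How the skill consumes these variables internally is not documented here, but defaults like the above are conventionally read with `os.environ.get` and a fallback; for instance:

```python
# Read the configuration variables above, falling back to the documented
# defaults when they are unset. The env parameter makes the lookup testable.
import os

def load_config(env=os.environ):
    return {
        "plaso_path": env.get("PLASO_PATH"),              # None -> system PATH
        "timezone": env.get("TIMELINE_TZ", "UTC"),
        "max_events": int(env.get("MAX_EVENTS", "10000000")),
        "cache_dir": env.get("CACHE_DIR", "./cache"),
    }

print(load_config({"TIMELINE_TZ": "US/Eastern"})["timezone"])  # US/Eastern
```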

Options

| Option | Type | Description |
|---|---|---|
| normalize_timezone | boolean | Normalize to UTC |
| deduplicate | boolean | Remove duplicate events |
| parallel_parsing | boolean | Parallel source parsing |
| cache_results | boolean | Cache parsed results |
| include_hash | boolean | Include file hashes |

Examples

Example 1: Incident Timeline Reconstruction

Scenario: Reconstructing attack timeline from evidence
```python
from timeline_forensics import TimelineBuilder, TimelineAnalyzer

# Build comprehensive timeline
builder = TimelineBuilder(case_id="INCIDENT-001")
builder.add_disk_image("/evidence/victim.E01")
builder.add_memory_dump("/evidence/memory.raw")
builder.add_logs("/evidence/logs/")
builder.add_pcap("/evidence/traffic.pcap")
timeline = builder.build()

# Analyze for attack indicators
analyzer = TimelineAnalyzer(timeline)

# Find initial compromise
initial = analyzer.find_events_with_keywords(["powershell", "cmd.exe"])
print(f"Potential initial access: {len(initial)}")

# Find lateral movement
lateral = analyzer.find_events_by_pattern("network_login")

# Build attack narrative
narrative = analyzer.build_narrative()
print(narrative)
```

Example 2: Data Breach Timeline

Scenario: Creating timeline for data exfiltration investigation
```python
from timeline_forensics import TimelineBuilder, TimelineCorrelator

# Build timeline
builder = TimelineBuilder(case_id="BREACH-001")
builder.add_disk_image("/evidence/server.E01")
builder.add_logs("/evidence/access_logs/")
timeline = builder.build()

# Find data access (backslashes escaped so the path is a valid string)
correlator = TimelineCorrelator(timeline)
data_access = correlator.correlate_by_path("\\SensitiveData\\")

# Find large file operations
large_ops = correlator.find_large_file_operations(min_size_mb=10)

# Generate breach timeline
correlator.generate_breach_report("/evidence/breach_timeline.html")
```

Limitations

  • Large timelines require significant memory
  • Timezone handling requires accurate source metadata
  • Some artifacts lack precise timestamps
  • Correlation accuracy depends on time synchronization
  • Visualization performance degrades with many events
  • Gap analysis assumes continuous activity
  • Pattern detection requires sufficient data

Troubleshooting

Common Issue 1: Memory Exhaustion

Problem: Out of memory when processing a large timeline
Solution:
  • Process in time chunks
  • Filter before loading
  • Increase system memory
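
"Process in time chunks" does not require loading the whole export: a timeline CSV can be streamed row by row and bucketed on the fly. A generic standard-library sketch (the `timestamp` column name is an assumption, not this skill's documented schema):

```python
# Stream a timeline CSV and count events per day without holding all rows
# in memory. io.StringIO stands in for a large on-disk file here.
import csv
import io
from collections import Counter

csv_data = io.StringIO(
    "timestamp,source,description\n"
    "2024-01-15T10:30:00Z,mft,malware.exe created\n"
    "2024-01-15T11:00:00Z,evtx,4688 process creation\n"
    "2024-01-16T09:00:00Z,pcap,beacon to 203.0.113.50\n"
)

per_day = Counter()
for row in csv.DictReader(csv_data):   # one row at a time, constant memory
    day = row["timestamp"][:10]        # "YYYY-MM-DD" prefix as the bucket
    per_day[day] += 1

print(dict(per_day))  # {'2024-01-15': 2, '2024-01-16': 1}
```

The same pattern works for pre-filtering: write matching rows straight to an output file instead of counting them.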

Common Issue 2: Timezone Confusion

Problem: Events appear at the wrong times
Solution:
  • Verify source timezones
  • Check DST handling
  • Normalize all to UTC
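
"Normalize all to UTC" in practice means attaching the correct source timezone to naive timestamps before converting, so DST transitions are handled for you. With the standard library's `zoneinfo` (Python 3.9+):

```python
# Convert a naive local timestamp from a known source timezone to UTC.
# ZoneInfo applies the right UTC offset for the date, including DST.
from datetime import datetime
from zoneinfo import ZoneInfo

def to_utc(naive_iso: str, source_tz: str) -> str:
    local = datetime.fromisoformat(naive_iso).replace(tzinfo=ZoneInfo(source_tz))
    return local.astimezone(ZoneInfo("UTC")).strftime("%Y-%m-%dT%H:%M:%SZ")

# A log written in US Eastern during standard time (UTC-5):
print(to_utc("2024-01-15T10:30:00", "America/New_York"))  # 2024-01-15T15:30:00Z
# The same wall-clock time in July falls in DST (UTC-4):
print(to_utc("2024-07-15T10:30:00", "America/New_York"))  # 2024-07-15T14:30:00Z
```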

Common Issue 3: Missing Events

Problem: Expected events are missing from the timeline
Solution:
  • Verify parser support
  • Check source integrity
  • Review parser logs

Related Skills

  • memory-forensics: Add memory artifacts to timeline
  • disk-forensics: Add disk artifacts to timeline
  • log-forensics: Add log events to timeline
  • network-forensics: Add network events to timeline
  • artifact-collection: Collect artifacts for timeline

References

  • Timeline Forensics Reference
  • Plaso Integration Guide
  • Timeline Analysis Techniques