Overview
Features
Core Fabric Operations
Workspace, lakehouse, warehouse, and table management; Delta table schemas and metadata retrieval; SQL query execution; reports and semantic model operations.
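These operations map onto REST calls against the Fabric service. As a minimal sketch, the endpoint paths below follow the public Microsoft Fabric REST API, and `build_url` is a hypothetical helper, not part of this project:

```python
# Sketch: composing Fabric REST endpoint URLs for core operations.
# The base URL follows the public Microsoft Fabric REST API; the
# helper and the placeholder workspace id are illustrative only.

FABRIC_API_BASE = "https://api.fabric.microsoft.com/v1"

def build_url(*segments: str) -> str:
    """Join path segments onto the Fabric API base URL."""
    return "/".join([FABRIC_API_BASE, *segments])

# Workspace- and item-level endpoints (workspace id is a placeholder)
list_workspaces = build_url("workspaces")
list_lakehouses = build_url("workspaces", "<workspace-id>", "lakehouses")
```

A real client would send authenticated GET/POST requests to these URLs; the sketch only shows how the resource hierarchy (workspace → lakehouse → table) is reflected in the paths.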
Intelligent PySpark Notebook Creation
Automatic notebook creation with six specialized templates to accelerate PySpark development.
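One way template-based notebook creation can work is to keep a set of starter cells per template and write them into the standard `.ipynb` JSON structure. The template names and cell contents below are illustrative assumptions, not the tool's actual templates:

```python
# Sketch: materialising a notebook from a named template.
# Template names and starter cells are hypothetical examples.
TEMPLATES = {
    "etl": [
        "# ETL pipeline starter",
        "df = spark.read.format('delta').load(source_path)",
    ],
    "analytics": [
        "# Analytics starter",
        "df.groupBy('category').count().show()",
    ],
}

def new_notebook(template: str) -> dict:
    """Return a minimal nbformat-4 notebook seeded with template cells."""
    cells = [
        {"cell_type": "code", "metadata": {}, "outputs": [], "source": src}
        for src in TEMPLATES[template]
    ]
    return {"nbformat": 4, "nbformat_minor": 5, "metadata": {}, "cells": cells}

nb = new_notebook("etl")
```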
Smart PySpark Code Generation
Generate code for common PySpark operations to expedite development.
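A common shape for this kind of generator is a small function that renders a parameterised PySpark snippet as a string. The function below is a hypothetical illustration of the idea, not the tool's actual generator:

```python
# Sketch: generating a PySpark read snippet from a few parameters.
def generate_read(table: str, fmt: str = "delta") -> str:
    """Return a PySpark snippet that reads a lakehouse table."""
    return f'df = spark.read.format("{fmt}").table("{table}")'

snippet = generate_read("sales")
```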
Code Validation and Best Practices
Comprehensive validation with syntax and best practices checks for PySpark and Fabric code.
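Best-practice validation can be sketched as pattern rules matched against submitted code. The rule list below is an assumption chosen to show the mechanism; the patterns flag well-known PySpark driver-side pitfalls:

```python
import re

# Sketch: rule-based best-practice checks for PySpark snippets.
# The rule list is illustrative, not the tool's actual rule set.
RULES = [
    (r"\.collect\(\)",
     "collect() pulls all rows to the driver; prefer limit() or an aggregation"),
    (r"\.toPandas\(\)",
     "toPandas() materialises the full DataFrame on the driver"),
    (r"\.repartition\(1\)",
     "repartition(1) forces a single task; consider coalesce()"),
]

def validate(code: str) -> list[str]:
    """Return a warning message for every rule the code triggers."""
    return [msg for pattern, msg in RULES if re.search(pattern, code)]

warnings = validate("rows = df.collect()")
```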
Fabric-specific Optimizations
Fabric-oriented optimizations and compatibility checks to maximize performance.
Performance Analysis and Recommendations
Performance scoring and optimization recommendations for notebooks and workflows.
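Performance scoring can be as simple as deducting points for each anti-pattern found in a snippet. The penalty table and weights below are illustrative assumptions:

```python
# Sketch: scoring a snippet out of 100 by deducting a penalty per
# anti-pattern found. Patterns and weights are hypothetical.
def performance_score(code: str, penalties: dict) -> int:
    """Return a 0-100 score, subtracting each triggered penalty."""
    score = 100
    for pattern, penalty in penalties.items():
        if pattern in code:
            score -= penalty
    return max(score, 0)

PENALTIES = {".collect()": 20, ".toPandas()": 20, "for row in": 10}
score = performance_score("result = df.toPandas()", PENALTIES)
```

A real analyzer would work on parsed execution plans rather than raw text, but the deduct-per-finding shape of the score is the same.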
Real-time Monitoring and Insights
Real-time monitoring and execution insights for PySpark workloads.
LLM-based Natural Language Interface
Natural language interface with context-aware assistance, code formatting, explanations, and smart optimization suggestions.
Who Is This For?
- Data Engineers: Build and manage Fabric workspaces, lakehouses, warehouses, and tables; use LLM-assisted code generation and validation.
- Data Scientists: Prototype PySpark data processing and analytics workflows within Fabric with notebook templates and optimization patterns.
- AI/LLM-enabled Developers: Interact with Fabric APIs through natural language, leveraging context and reasoning for Fabric-aligned code.