# inference 0.1.0-beta.5
Zero-setup ML inference for Flutter using Rust engines (Candle, ONNX, Linfa). Load PyTorch and ONNX models, or train on-device, through a single unified API with cross-platform support.
## 0.1.0-beta.5
- Improved pub.dev score through comprehensive documentation and code-quality fixes
- Added detailed library documentation with examples and feature overview
- Documented all previously undocumented constructors and classes, reaching 100% API documentation coverage
- Fixed static-analysis issues, including unused imports and style violations
- Added package topics for better discoverability
- Enhanced code formatting and consistency
## 0.1.0-beta.2
- Bug fixes and improvements for the second beta release
- Stability enhancements and performance optimizations
- Improved error handling and documentation
- Updated dependencies and build configurations
## 0.1.0-beta.1
- Initial beta release of the Inference Flutter package
- Zero-setup ML inference with unified API for Candle and Linfa engines
- Support for automatic model format detection (.safetensors, .pt, .pth)
- Cross-platform support (Android, iOS, Windows, macOS, Linux)
- Multiple input types: ImageInput, NLPInput, TensorInput, AudioInput
- Comprehensive example app with image classification, text sentiment, and on-device training
- Built with Flutter Rust Bridge 2.0 for efficient Dart-to-Rust interop
- GPU acceleration support where available
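As a rough illustration of how the unified API and input types listed above might fit together: the sketch below is hypothetical and only the input-type class names (`ImageInput`, `NLPInput`) come from this changelog; the session class, its `load`/`run` methods, and the asset path are illustrative assumptions, not the package's confirmed API.

```dart
import 'package:inference/inference.dart';

Future<void> main() async {
  // Hypothetical: load a model by path; per the beta.1 notes, the
  // format (.safetensors, .pt, .pth) would be detected automatically.
  // `InferenceSession.load` is an assumed name for illustration only.
  final session = await InferenceSession.load('assets/classifier.safetensors');

  // ImageInput is one of the input types named in the changelog;
  // `fromFile` and `run` are assumed signatures.
  final result = await session.run(ImageInput.fromFile('cat.jpg'));
  print(result);

  // The same session API would accept the other documented input
  // types (NLPInput, TensorInput, AudioInput) in place of ImageInput.
}
```

The appeal of a unified API like this is that the engine choice (Candle vs. Linfa) and model format stay behind a single entry point, so app code does not branch per backend.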
## 0.0.1
- TODO: Describe initial release.