clanker 1 month ago
revision
200b1b596c

+ 4 - 0
.gitignore

@@ -0,0 +1,4 @@
+/test_out*
+/test_err*
+__pycache__
+*.egg-info

+ 135 - 0
README.md

@@ -0,0 +1,135 @@
+# Autusm - Automatic USM Generator
+
+Autusm is a command-line utility that automatically generates USM (Universal Source Manifest) files from source packages. It analyzes source code, detects build systems, extracts metadata, and creates USM-compatible scripts for acquiring, building, and installing packages.
+
+## Features
+
+- **Multi-format Support**: Downloads and extracts tar, tar.gz, tar.bz2, and zip archives
+- **Build System Detection**: Automatically detects autotools, CMake, Meson, Make, Python, Cargo, and NPM build systems
+- **Metadata Extraction**: Extracts package information from package.json, setup.py, Cargo.toml, and other configuration files
+- **USM Integration**: Integrates with the USM package manager for autoprovides
+- **Interactive Mode**: Prompts for missing information in interactive mode
+- **Script Generation**: Generates USM-compatible acquire, build, and install scripts
+
+## Installation
+
+### From PyPI
+
+```bash
+pip install autusm
+```
+
+### From Source
+
+```bash
+git clone https://github.com/autusm/autusm.git
+cd autusm
+pip install -e .
+```
+
+## Usage
+
+### Basic Usage
+
+```bash
+autusm https://example.com/source.tar.gz
+```
+
+### Advanced Usage
+
+```bash
+autusm https://example.com/source.tar.gz \
+  --output-dir ./my-package \
+  --name my-package \
+  --version 1.0.0 \
+  --summary "My awesome package" \
+  --verbose
+```
+
+### Options
+
+- `URL`: URL to source archive (required)
+- `-o, --output-dir`: Output directory for generated files (default: current directory)
+- `-w, --work-dir`: Working directory for temporary files
+- `-n, --name`: Override package name
+- `-v, --version`: Override package version
+- `-s, --summary`: Override package summary
+- `--non-interactive`: Run in non-interactive mode
+- `--verbose`: Enable verbose output
+- `--quiet`: Suppress non-error output
+- `--skip-usm-check`: Skip USM availability check
+
+## Supported Build Systems
+
+- **Autotools**: Projects using configure scripts and Makefile.am
+- **CMake**: Projects with CMakeLists.txt
+- **Meson**: Projects with meson.build
+- **Make**: Projects with Makefile
+- **Python**: Projects with setup.py or pyproject.toml
+- **Cargo**: Rust projects with Cargo.toml
+- **NPM**: Node.js projects with package.json
+
+## Supported Metadata Formats
+
+- **Python**: setup.py, pyproject.toml, setup.cfg
+- **Node.js**: package.json
+- **Rust**: Cargo.toml
+- **PHP**: composer.json
+- **Java**: pom.xml, build.gradle
+- **Ruby**: Gemfile, *.gemspec
+- **Perl**: Makefile.PL, META.json, META.yml
+
+## Output Files
+
+Autusm generates the following files in the output directory:
+
+- `MANIFEST.usm`: USM manifest file in JSON format
+- `scripts/acquire`: Script to acquire source code
+- `scripts/build`: Script to build the package
+- `scripts/install`: Script to install the package
+
+## Examples
+
+### Processing a Python Package
+
+```bash
+autusm https://github.com/python/cpython/archive/refs/tags/v3.11.0.tar.gz
+```
+
+### Processing a CMake Project
+
+```bash
+autusm https://github.com/Kitware/CMake/archive/v3.25.1.tar.gz
+```
+
+### Non-Interactive Mode
+
+```bash
+autusm https://example.com/source.tar.gz \
+  --name my-package \
+  --version 1.0.0 \
+  --summary "My package summary" \
+  --non-interactive
+```
+
+## USM Integration
+
+If USM is installed on your system, autusm will automatically:
+
+1. Run `usm manifest autoprovides` to detect provided resources
+2. Merge autoprovides into the generated manifest
+3. Validate the generated manifest with USM
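
Step 2 above can be sketched roughly as follows. This is a hypothetical illustration, not autusm's actual implementation; the `merge_autoprovides` helper and the dictionary shapes are assumptions:

```python
# Hypothetical sketch of the autoprovides merge: resources reported by
# `usm manifest autoprovides` are added to the manifest's "provides" table,
# while entries already present in the generated manifest are kept as-is.
def merge_autoprovides(manifest: dict, autoprovides: dict) -> dict:
    merged = dict(manifest)
    provides = dict(merged.get("provides", {}))
    for ref, entry in autoprovides.items():
        provides.setdefault(ref, entry)  # existing entries win
    merged["provides"] = provides
    return merged

manifest = {"name": "hello", "provides": {"bin:hello": "as-expected"}}
auto = {"bin:hello": "install:usr/bin/hello", "man:hello.1": "as-expected"}
print(merge_autoprovides(manifest, auto)["provides"])
# → {'bin:hello': 'as-expected', 'man:hello.1': 'as-expected'}
```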
+
+## Contributing
+
+Contributions are welcome! Please see the [Contributing Guidelines](CONTRIBUTING.md) for details.
+
+## License
+
+Autusm is released under the MIT License. See the [LICENSE](LICENSE) file for details.
+
+## Support
+
+- **Documentation**: [https://autusm.org/docs](https://autusm.org/docs)
+- **Issues**: [https://github.com/autusm/autusm/issues](https://github.com/autusm/autusm/issues)
+- **Discussions**: [https://github.com/autusm/autusm/discussions](https://github.com/autusm/autusm/discussions)

+ 138 - 0
USM-SPEC.md

@@ -0,0 +1,138 @@
+# USM Manifest JSON Schema Specification
+
+## Overview
+
+The USM (Universal Source Manifest) is a JSON-based format that defines the structure and metadata of a software package. It provides all necessary information for building, installing, and managing software packages across different systems.
+
+The manifest file must be named `MANIFEST.usm` and located at the root of a software project.
+
+## Schema Structure
+
+The USM manifest file contains a JSON object with the following properties:
+
+- `name` (string, required): The name of the software package (no spaces allowed).
+- `version` (string, required): The semantic version of the package, with an optional package version separated by a `+`, e.g.: 
+    - `"1.1.2"` for a regular release.
+    - `"1.1.2+1"` for the same version, but with a fix to the packaging.
+- `summary` (string, required): A short summary describing the package, e.g. `Universal Source Manifest` for the package `usm`.
+- `licences` (array, required): A list of licences applicable to this package. Each element is a licence object with the following required properties:
+    - `name` (string, required): The name of the licence.
+    - `text` (string, required): The path, relative to `MANIFEST.usm`, to the full text of the licence.
+    - `category` (string, required): Must be one of:
+        - `"libre"`: for software that meets the [four essential freedoms](https://www.gnu.org/philosophy/free-sw.en.html#fs-definition), that being (from gnu.org):
+            - The freedom to run the program as you wish, for any purpose (freedom 0).
+            - The freedom to study how the program works, and change it so it does your computing as you wish (freedom 1). Access to the source code is a precondition for this.
+            - The freedom to redistribute copies so you can help others (freedom 2).
+            - The freedom to distribute copies of your modified versions to others (freedom 3). By doing this you can give the whole community a chance to benefit from your changes. Access to the source code is a precondition for this.
+        - `"open-source"`: for software which may be regarded as "open source" but fall short of the free software definition. In cases where an "open-source" licence *does* meet all the criteria to be considered free/libre software (by the above definition), `"libre"` must be used instead. Examples of licences considered open-source but not libre include:
+            - The [NASA Open Source Agreement v1.3](https://directory.fsf.org/wiki/License:NASA-1.3) is not recognised as a "libre" licence because it includes a provision requiring changes to be your “original creation”, conflicting with "freedom 1" since you cannot make changes that incorporate intellectual property that may be under a permissive licence.
+            - The [Artistic-1.0](https://directory.fsf.org/wiki/License:Artistic-1.0) licence is too vague and might not protect the user's freedoms.
+        - `"source-available"`: for software that makes the full source code available, but has restrictions on what the user can do with the software, or the source code. Examples include:
+            - The [Microsoft Shared Source CLI, C#, and Jscript License](https://directory.fsf.org/wiki/License:Ms-SS) which does not permit commercial distribution.
+            - The [Anti-Capitalist Software License (v 1.4)](https://directory.fsf.org/wiki/License:ANTI-1.4) which places restrictions on the types of users or organisations that may make use of the software.
+            - The [JSON License](https://directory.fsf.org/wiki/License:JSON) which mandates that "The Software shall be used for Good, not Evil", thereby violating freedom 0 above.
+        - `"proprietary"`: for everything else.
+- `provides` (object, required): A dictionary object containing properties describing the resources that this software package provides:
+    - The property name must be a [ResourceRef](#resource-references) string describing a resource that this package provides.
+    - The value may be either:
+        - An object with the following properties:
+            - `path` (string): The path to the resource file, relative to `pathBase`. Required when `type` is `"reg"` and `pathBase` is not `"as-expected"`; it must not be set otherwise.
+            - `pathBase` (string): What the `path` property is relative to (see [Installation Lifecycle Directories](#installation-lifecycle-directories)). It is required when `type` is `"reg"` and must not be set otherwise. It can be set to one of the following:
+                - `"source"`: referencing the path to the extracted source code.
+                - `"build"`: referencing the path to the built output (e.g. where the result of `make` might go).
+                - `"install"`: referencing the path to the install output (e.g. where the result of `make install` might go).
+                - `"as-expected"` which will copy the file from the install output (e.g. where the result of `make install` might go) into the place expected for a resource of the type and name specified by the [ResourceRef](#resource-references). When in use, `path` must be an empty string.
+            - `type` (string, required): The type of filesystem entry for this resource, can be one of the following:
+                - `"reg"`: For regular files.
+                - `"dir"`: For making directories.
+                - `"lnk"`: For symbolic links.
+            - `dest` (string): The destination for the symbolic link. Required when `type` is `"lnk"` and must not be set otherwise.
+            - `keepOn` (string array): For setting restrictions on deleting this resource, each string in the array can be one of the following:
+                - `"final"`: To inhibit deletion when the package is being uninstalled at the request of the user.
+                - `"upgrade"`: To inhibit deletion of the resource when the package is being uninstalled to make way for an updated version.
+                - `"downgrade"`: To inhibit deletion of the resource when the package is being uninstalled to make way for an older version.
+            - `skipFor` (string array): For setting restrictions on installing this resource, each string in the array can be one of the following:
+                - `"fresh"`: To inhibit installation of the resource when the package is being installed for the first time.
+                - `"upgrade"`: To inhibit the installation of the resource when the package is being updated to a newer version.
+                - `"downgrade"`: To inhibit the installation of the resource when the package is being downgraded to an older version.
+        - A shorthand string in the format `[path-base]:[path]` describing the file path to the resource so that it can be copied on installation (see `pathBase` and `path` properties above).
+        - The literal string `"as-expected"` (sans colon) which has the same meaning as `{"pathBase": "as-expected"}` above.
+- `depends` (object, required): An object with the following properties describing different kinds of package dependencies:
+    - `runtime` (string array, required): Each string must be a [ResourceRef](#resource-references) describing the resources this package needs at runtime.
+    - `build` (string array, required): Each string must be a [ResourceRef](#resource-references) describing the resources this package needs at compile (build) time.
+    - `manage` (string array, required): Each string must be a [ResourceRef](#resource-references) describing the resources needed to run the processes specified in the `execs` section of the manifest (other than `acquire`).
+    - `acquire` (string array): Each string must be a [ResourceRef](#resource-references) describing the resources this package needs to run the acquire script (when specified on the `acquire` property of the `execs` object).
+- `flags` (string array, required): A list of flags for this manifest, each string in the array can be one of the following:
+    - `"buildInSourceTree"` to tell USM not to create a separate build directory, and to compile the package "in place".
+    - `"setManifestPropertyEnvs"`: (TODO: document this flag).
+- `execs` (object, required): An object describing executable scripts for different phases of the package lifecycle (see [Executable Scripts](#executable-scripts)). It has the following properties:
+    - `build` (string, required): Path to the build script relative to the package root.
+    - `install` (string, optional): Path to the install script relative to the package root.
+    - `remove` (string, optional): Path to the removal script relative to the package root.
+    - `postInstall` (string, optional): Path to the post-install script relative to the package root.
+    - `acquire` (string, optional): Path to the acquire script relative to the package root.
+- `md` (string, optional): Path to a markdown file containing a detailed description of the package, relative to the package root.
+- `url` (string, optional): URL to the project's website or repository.
+- `screenshots` (string array, optional): Array of paths to screenshot files, relative to the package root.
+- `icon` (string, optional): Path to an icon file for the package, relative to the package root.
+- `metainfo` (string, optional): Path to a metainfo file following the AppStream specification, relative to the package root.
+- `git` (object, optional): Git repository information with the following properties:
+    - `origin` (string, required): URL to the git repository.
+    - `commit` (string, required): Specific commit hash, tag, or branch to use.
+- `extras` (object, optional): Additional properties that don't fit into the standard schema.
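
Putting the required properties together, a minimal manifest might look like the following. This is an illustrative sketch only; the package name, paths, and ResourceRefs are invented:

```json
{
    "name": "hello",
    "version": "1.0.0+1",
    "summary": "Prints a friendly greeting",
    "licences": [
        {"name": "MIT", "text": "LICENSE", "category": "libre"}
    ],
    "provides": {
        "bin:hello": "install:usr/bin/hello",
        "man:hello.1": "as-expected"
    },
    "depends": {
        "runtime": ["lib:libc.so.6"],
        "build": ["bin:make", "bin:cc"],
        "manage": []
    },
    "flags": [],
    "execs": {
        "build": "scripts/build",
        "install": "scripts/install"
    }
}
```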
+
+## Resource References
+
+Resource references are strings in the format `[resource-type]:[resource-name]` that identify specific resources provided or required by a package. The resource type determines where the resource is located or expected to be found on the filesystem.
+
+### Resource Types
+
+The following resource types are defined:
+
+- `rootpath`: A resource located at the root of the filesystem (`/[resource-name]`).
+- `path`: A resource located in the `/usr` hierarchy (`/usr/[resource-name]`).
+- `opt`: A resource located in the `/opt` hierarchy (`/opt/[resource-name]`).
+- `res`: A resource located in the `/usr/share` hierarchy (`/usr/share/[resource-name]`).
+- `cfg`: A resource located in the `/etc` hierarchy (`/etc/[resource-name]`).
+- `bin`: An executable binary located in the system PATH.
+- `sbin`: A system executable binary located in `/usr/sbin` or `/sbin`.
+- `lib`: A shared library located in the system library paths.
+- `libexec`: A library executable located in `/usr/libexec`.
+- `libres`: A library resource located in `/usr/lib`, `/usr/lib64`, `/lib`, or `/lib64`.
+- `info`: An info page located in `/usr/share/info`.
+- `man`: A manual page located in `/usr/share/man`.
+- `locale`: A locale resource located in `/usr/share/locale`.
+- `app`: An application desktop file located in `/usr/share/applications`.
+- `inc`: A C/C++ header file located in `/usr/include`.
+- `pc`: A pkg-config file located in the system pkg-config paths.
+- `vapi`: A Vala API file located in `/usr/share/vala/vapi` or `/usr/share/vala-0.56/vapi`.
+- `gir`: A GObject Introspection file located in `/usr/share/gir`.
+- `typelib`: A GObject typelib file located in `/usr/lib64/girepository-1.0`, `/usr/lib/girepository-1.0`, `/lib64/girepository-1.0`, or `/lib/girepository-1.0`.
+- `tag`: A tag resource managed by the USM system.
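
As an illustration of how the simpler, single-directory types above map to filesystem locations, here is a small sketch. It is not part of USM; the `expected_path` helper and the subset of types it handles are assumptions for illustration:

```python
# Hypothetical sketch: resolve a ResourceRef string ("type:name") to the
# directory where that resource would be expected, per the table above.
# Only a few of the single-directory resource types are mapped here.
EXPECTED_DIRS = {
    "rootpath": "/",
    "path": "/usr",
    "opt": "/opt",
    "res": "/usr/share",
    "cfg": "/etc",
    "info": "/usr/share/info",
    "man": "/usr/share/man",
    "inc": "/usr/include",
    "app": "/usr/share/applications",
}

def expected_path(resource_ref: str) -> str:
    """Return the expected absolute path for a ResourceRef like 'cfg:myapp.conf'."""
    rtype, _, name = resource_ref.partition(":")
    if not name:
        raise ValueError(f"not a ResourceRef: {resource_ref!r}")
    base = EXPECTED_DIRS.get(rtype)
    if base is None:
        raise ValueError(f"unsupported or multi-directory resource type: {rtype}")
    return f"{base.rstrip('/')}/{name}"

print(expected_path("cfg:myapp.conf"))  # → /etc/myapp.conf
print(expected_path("man:ls.1"))        # → /usr/share/man/ls.1
```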
+
+### Example Resource References
+
+- `bin:ls`: The `ls` executable binary.
+- `lib:libc.so.6`: The libc shared library.
+- `man:ls.1`: The manual page for the `ls` command.
+- `cfg:myapp.conf`: A configuration file for `myapp`.
+- `app:myapp.desktop`: A desktop entry for `myapp`.
+
+## Installation Lifecycle Directories
+
+Each of these directories is created by USM at runtime for each package.
+
+- The source directory (`"pathBase": "source"`) is where the source code is extracted to (in the case of a `.usmc` package), or acquired into (when using an `acquire` exec).
+- The build directory (`"pathBase": "build"`) is where the `build` exec is instructed to output the built program to.
+- The install directory (`"pathBase": "install"`) is where the `install` exec (if applicable) is instructed to install the program to (i.e. the `DESTDIR` in most build systems).
+
+
+## Executable Scripts
+
+All executable scripts except for `remove` and `postInstall` are run from the source directory (see above). They may be anywhere in the source package, but are often placed in a folder at the root of the package called `scripts`.
+
+- `build` is the only required exec script; it contains the commands needed to actually compile the program from source. The build directory is passed to this exec as the first argument.
+- `install` when specified will install the files into the install directory, in the same structure that it would if that directory were the filesystem root. The build directory is passed to this exec as the first argument, the install directory is passed as the second argument, and the install type ("fresh", "upgrade", "downgrade") is passed as the third argument.
+- `remove` is called just before USM deletes the resources of a package on uninstallation. The remove type ("final", "upgrade", "downgrade") is passed as the first argument.
+- `postInstall` is called after USM has installed the package's resources to the system. The build directory is passed to this exec as the first argument, and the install type ("fresh", "upgrade", "downgrade") is passed as the second argument.
+- `acquire` is used to download the authoritative sources. It is typically used by package maintainers when a project does not natively ship a `MANIFEST.usm`: many USM manifests can be kept in a single repository, and tooling can download and extract the sources to produce `.usmc` (USM Complete) packages. This script usually consists of either a `git clone` command, or a `wget` call followed by an archive extraction command. No parameters are passed to this exec.

+ 80 - 0
example_usage.py

@@ -0,0 +1,80 @@
+#!/usr/bin/env python3
+"""
+Example usage of autusm library without CLI.
+"""
+
+import sys
+import tempfile
+from pathlib import Path
+
+# Add the src directory to the path so we can import autusm
+sys.path.insert(0, str(Path(__file__).parent / "src"))
+
+from autusm.models import PackageInfo, BuildSystem, BuildSystemType
+from autusm.generator import ScriptGenerator
+from autusm.manifest import ManifestGenerator
+
+
+def main():
+    """Demonstrate autusm usage."""
+    print("Autusm Library Usage Example")
+    print("=" * 40)
+    
+    # Create a sample package info
+    package_info = PackageInfo(
+        name="example-package",
+        version="1.0.0",
+        summary="An example package for demonstration",
+        description="This is an example package to demonstrate autusm functionality",
+        url="https://github.com/example/example-package",
+        authors=["Example Author"],
+        runtime_dependencies=["python3", "libexample"],
+        build_dependencies=["gcc", "make"]
+    )
+    
+    # Create a sample build system
+    build_system = BuildSystem(
+        type=BuildSystemType.CMAKE,
+        config_files=["CMakeLists.txt"],
+        build_files=["CMakeLists.txt"],
+        detected_commands=["cmake -DCMAKE_INSTALL_PREFIX=/usr .", "make", "make install"],
+        custom_args={"BUILD_TESTING": "ON"}
+    )
+    
+    print(f"Package: {package_info.name} v{package_info.version}")
+    print(f"Summary: {package_info.summary}")
+    print(f"Build System: {build_system.type.value}")
+    
+    # Generate scripts
+    with tempfile.TemporaryDirectory() as temp_dir:
+        scripts_dir = Path(temp_dir) / "scripts"
+        
+        generator = ScriptGenerator()
+        generator.generate_scripts(package_info, build_system, scripts_dir)
+        
+        print("\nGenerated Scripts:")
+        for script_file in ["acquire", "build", "install"]:
+            script_path = scripts_dir / script_file
+            if script_path.exists():
+                print(f"  - {script_path}")
+                
+                # Show first few lines of each script
+                with open(script_path, 'r') as f:
+                    lines = f.readlines()[:5]
+                    print("    First 5 lines:")
+                    for i, line in enumerate(lines, 1):
+                        print(f"      {i}: {line.rstrip()}")
+                    print("      ...")
+    
+    # Generate manifest
+    manifest_generator = ManifestGenerator()
+    manifest = manifest_generator.generate(package_info, build_system)
+    
+    print("\nGenerated Manifest (JSON):")
+    print(manifest.to_json())
+    
+    print("\nExample completed successfully!")
+
+
+if __name__ == "__main__":
+    main()

+ 53 - 0
pyproject.toml

@@ -0,0 +1,53 @@
+[build-system]
+requires = ["setuptools>=61.0", "wheel"]
+build-backend = "setuptools.build_meta"
+
+[project]
+name = "autusm"
+version = "0.1.0"
+description = "Automatic USM (Universal Source Manifest) generator for source packages"
+authors = [{name = "Autusm Team", email = "team@autusm.org"}]
+license = {text = "MIT"}
+readme = "README.md"
+requires-python = ">=3.8"
+classifiers = [
+    "Development Status :: 3 - Alpha",
+    "Intended Audience :: Developers",
+    "License :: OSI Approved :: MIT License",
+    "Programming Language :: Python :: 3",
+    "Programming Language :: Python :: 3.8",
+    "Programming Language :: Python :: 3.9",
+    "Programming Language :: Python :: 3.10",
+    "Programming Language :: Python :: 3.11",
+    "Programming Language :: Python :: 3.12",
+]
+dependencies = [
+    "requests>=2.25.0",
+    "click>=8.0.0",
+    "pyyaml>=6.0",
+    "toml>=0.10.2",
+    "packaging>=21.0",
+]
+
+[project.optional-dependencies]
+dev = [
+    "pytest>=6.0",
+    "pytest-cov>=2.0",
+    "black>=21.0",
+    "flake8>=3.9",
+    "mypy>=0.910",
+]
+
+[project.scripts]
+autusm = "autusm.cli:main"
+
+[project.urls]
+Homepage = "https://github.com/autusm/autusm"
+Repository = "https://github.com/autusm/autusm"
+Issues = "https://github.com/autusm/autusm/issues"
+
+[tool.setuptools.packages.find]
+where = ["src"]
+
+[tool.setuptools.package-dir]
+"" = "src"

+ 15 - 0
src/autusm/__init__.py

@@ -0,0 +1,15 @@
+"""
+Autusm - Automatic USM (Universal Source Manifest) generator
+
+This package provides tools to automatically generate USM manifests
+from source packages by analyzing their structure, build systems,
+and metadata.
+"""
+
+__version__ = "0.1.0"
+__author__ = "Autusm Team"
+__email__ = "team@autusm.org"
+
+from .models import PackageInfo, BuildSystem, USMManifest
+
+__all__ = ["PackageInfo", "BuildSystem", "USMManifest"]

+ 391 - 0
src/autusm/analyzer.py

@@ -0,0 +1,391 @@
+"""
+Source analyzer for autusm.
+
+This module provides functionality to analyze source code and detect
+build systems, dependencies, and other project characteristics.
+"""
+
+import os
+import logging
+import re
+from pathlib import Path
+from typing import List, Dict, Optional, Set
+
+from .models import BuildSystem, BuildSystemType
+from .exceptions import AnalysisError
+
+
+logger = logging.getLogger(__name__)
+
+
+class SourceAnalyzer:
+    """Analyzer for source code and build systems."""
+
+    def __init__(self):
+        """Initialize the source analyzer."""
+        # Define patterns for build system detection
+        self.build_system_patterns = {
+            BuildSystemType.AUTOTOOLS: [
+                r"configure\.(ac|in)$",
+                r"Makefile\.am$",
+                r"autogen\.sh$",
+                r"bootstrap$"
+            ],
+            BuildSystemType.CMAKE: [
+                r"CMakeLists\.txt$",
+                r"\.cmake$"
+            ],
+            BuildSystemType.MESON: [
+                r"meson\.build$",
+                r"meson_options\.txt$"
+            ],
+            BuildSystemType.MAKE: [
+                r"Makefile$",
+                r"makefile$",
+                r"GNUmakefile$"
+            ],
+            BuildSystemType.PYTHON: [
+                r"setup\.py$",
+                r"pyproject\.toml$",
+                r"requirements\.txt$"
+            ],
+            BuildSystemType.CARGO: [
+                r"Cargo\.toml$",
+                r"Cargo\.lock$"
+            ],
+            BuildSystemType.NPM: [
+                r"package\.json$",
+                r"package-lock\.json$",
+                r"npm-shrinkwrap\.json$"
+            ]
+        }
+
+        # Define patterns for dependency detection
+        self.dependency_patterns = {
+            "c": [
+                r"#include\s+[<\"](.*\.h)[>\"]",
+                r"#include\s+[<\"](.*\.hpp)[>\"]"
+            ],
+            "cpp": [
+                r"#include\s+[<\"](.*\.h)[>\"]",
+                r"#include\s+[<\"](.*\.hpp)[>\"]"
+            ],
+            "python": [
+                r"import\s+([a-zA-Z_][a-zA-Z0-9_]*)",
+                r"from\s+([a-zA-Z_][a-zA-Z0-9_]*)\s+import"
+            ],
+            "rust": [
+                r"use\s+([a-zA-Z_][a-zA-Z0-9_]*)::",
+                r"extern\s+crate\s+([a-zA-Z_][a-zA-Z0-9_]*)"
+            ],
+            "javascript": [
+                r"require\s*\(\s*['\"]([^'\"]+)['\"]",
+                r"import\s+.*\s+from\s+['\"]([^'\"]+)['\"]"
+            ]
+        }
+
+    def detect_build_system(self, source_dir: Path) -> BuildSystem:
+        """Detect the build system used by a project.
+        
+        Args:
+            source_dir: Path to the source directory
+            
+        Returns:
+            BuildSystem object with detected information
+            
+        Raises:
+            AnalysisError: If analysis fails
+        """
+        try:
+            logger.info(f"Analyzing build system in {source_dir}")
+            
+            # Initialize result
+            detected_type = BuildSystemType.UNKNOWN
+            config_files = []
+            build_files = []
+            detected_commands = []
+            custom_args = {}
+            
+            # Walk through the source directory
+            for root, dirs, files in os.walk(source_dir):
+                # Skip hidden directories and common build directories
+                dirs[:] = [d for d in dirs if not d.startswith('.') and d not in ['build', 'target', 'node_modules', '__pycache__']]
+                
+                for file in files:
+                    file_path = Path(root) / file
+                    relative_path = file_path.relative_to(source_dir)
+                    
+                    # Check against build system patterns
+                    for build_type, patterns in self.build_system_patterns.items():
+                        for pattern in patterns:
+                            if re.match(pattern, file, re.IGNORECASE):
+                                if detected_type == BuildSystemType.UNKNOWN:
+                                    detected_type = build_type
+                                elif detected_type != build_type:
+                                    # Multiple build systems detected, prefer more specific ones
+                                    detected_type = self._resolve_build_system_conflict(detected_type, build_type)
+                                
+                                config_files.append(str(relative_path))
+                                break
+                    
+                    # Look for common build files
+                    if file.lower() in ["makefile", "gnumakefile"]:
+                        build_files.append(str(relative_path))
+                    elif file.lower() == "cmakelists.txt":
+                        build_files.append(str(relative_path))
+            
+            # Get detected commands based on build system
+            detected_commands = self._get_build_commands(detected_type, source_dir)
+            
+            # Get custom arguments based on build system
+            custom_args = self._get_custom_args(detected_type, source_dir)
+            
+            build_system = BuildSystem(
+                type=detected_type,
+                config_files=list(set(config_files)),
+                build_files=list(set(build_files)),
+                detected_commands=detected_commands,
+                custom_args=custom_args
+            )
+            
+            logger.info(f"Detected build system: {detected_type.value}")
+            return build_system
+            
+        except Exception as e:
+            logger.error(f"Failed to analyze build system: {e}")
+            raise AnalysisError(f"Failed to analyze build system: {e}")
+
+    def _resolve_build_system_conflict(self, current: BuildSystemType, new: BuildSystemType) -> BuildSystemType:
+        """Resolve conflicts when multiple build systems are detected.
+        
+        Args:
+            current: Currently detected build system
+            new: Newly detected build system
+            
+        Returns:
+            The preferred build system
+        """
+        # Define priority order (higher number = higher priority)
+        priority = {
+            BuildSystemType.AUTOTOOLS: 4,
+            BuildSystemType.CMAKE: 5,
+            BuildSystemType.MESON: 6,
+            BuildSystemType.MAKE: 3,
+            BuildSystemType.PYTHON: 2,
+            BuildSystemType.CARGO: 7,
+            BuildSystemType.NPM: 8,
+            BuildSystemType.UNKNOWN: 1
+        }
+        
+        return current if priority[current] >= priority[new] else new
+
+    def _get_build_commands(self, build_type: BuildSystemType, source_dir: Path) -> List[str]:
+        """Get the typical build commands for a build system.
+        
+        Args:
+            build_type: Type of build system
+            source_dir: Path to source directory
+            
+        Returns:
+            List of build commands
+        """
+        commands = {
+            BuildSystemType.AUTOTOOLS: [
+                "./configure --prefix=/usr",
+                "make",
+                "make install"
+            ],
+            BuildSystemType.CMAKE: [
+                "cmake -DCMAKE_INSTALL_PREFIX=/usr .",
+                "make",
+                "make install"
+            ],
+            BuildSystemType.MESON: [
+                "meson setup builddir --prefix=/usr",
+                "meson compile -C builddir",
+                "meson install -C builddir"
+            ],
+            BuildSystemType.MAKE: [
+                "make",
+                "make install"
+            ],
+            BuildSystemType.PYTHON: [
+                "python setup.py build",
+                "python setup.py install"
+            ],
+            BuildSystemType.CARGO: [
+                "cargo build --release",
+                "cargo install --path ."
+            ],
+            BuildSystemType.NPM: [
+                "npm install",
+                "npm run build"
+            ]
+        }
+        
+        return commands.get(build_type, [])
+
+    def _get_custom_args(self, build_type: BuildSystemType, source_dir: Path) -> Dict[str, str]:
+        """Get custom arguments for a build system based on project characteristics.
+        
+        Args:
+            build_type: Type of build system
+            source_dir: Path to source directory
+            
+        Returns:
+            Dictionary of custom arguments
+        """
+        args = {}
+        
+        if build_type == BuildSystemType.CMAKE:
+            # Check for common CMake options
+            if (source_dir / "tests").exists():
+                args["BUILD_TESTING"] = "ON"
+            
+            # Check for documentation
+            if (source_dir / "docs").exists():
+                args["BUILD_DOCS"] = "ON"
+                
+        elif build_type == BuildSystemType.AUTOTOOLS:
+            # Check for common autotools options
+            if (source_dir / "tests").exists():
+                args["enable_tests"] = "--enable-tests"
+                
+        return args
+
+    def analyze_dependencies(self, source_dir: Path) -> Dict[str, List[str]]:
+        """Analyze dependencies in the source code.
+        
+        Args:
+            source_dir: Path to the source directory
+            
+        Returns:
+            Dictionary mapping file types to lists of dependencies
+            
+        Raises:
+            AnalysisError: If analysis fails
+        """
+        try:
+            dependencies = {}
+            
+            # Walk through the source directory
+            for root, dirs, files in os.walk(source_dir):
+                # Skip hidden directories and common build directories
+                dirs[:] = [d for d in dirs if not d.startswith('.') and d not in ['build', 'target', 'node_modules', '__pycache__']]
+                
+                for file in files:
+                    file_path = Path(root) / file
+                    file_ext = file_path.suffix.lower()
+                    
+                    # Determine file type
+                    if file_ext in ['.c', '.h']:
+                        file_type = "c"
+                    elif file_ext in ['.cpp', '.cxx', '.cc', '.hpp']:
+                        file_type = "cpp"
+                    elif file_ext == '.py':
+                        file_type = "python"
+                    elif file_ext == '.rs':
+                        file_type = "rust"
+                    elif file_ext in ['.js', '.jsx', '.ts', '.tsx']:
+                        file_type = "javascript"
+                    else:
+                        continue
+                    
+                    # Extract dependencies
+                    if file_type not in dependencies:
+                        dependencies[file_type] = set()
+                    
+                    file_dependencies = self._extract_file_dependencies(file_path, file_type)
+                    dependencies[file_type].update(file_dependencies)
+            
+            # Convert sets to lists
+            return {k: list(v) for k, v in dependencies.items()}
+            
+        except Exception as e:
+            logger.error(f"Failed to analyze dependencies: {e}")
+            raise AnalysisError(f"Failed to analyze dependencies: {e}") from e
+
+    def _extract_file_dependencies(self, file_path: Path, file_type: str) -> Set[str]:
+        """Extract dependencies from a single file.
+        
+        Args:
+            file_path: Path to the file
+            file_type: Type of the file
+            
+        Returns:
+            Set of dependencies
+        """
+        dependencies = set()
+        
+        try:
+            with open(file_path, 'r', encoding='utf-8', errors='ignore') as f:
+                content = f.read()
+                
+            # Apply patterns for the file type
+            if file_type in self.dependency_patterns:
+                for pattern in self.dependency_patterns[file_type]:
+                    matches = re.findall(pattern, content)
+                    dependencies.update(matches)
+                    
+        except Exception as e:
+            logger.warning(f"Failed to extract dependencies from {file_path}: {e}")
+            
+        return dependencies
+
+    def analyze_project_structure(self, source_dir: Path) -> Dict[str, List[str]]:
+        """Analyze the project structure.
+        
+        Args:
+            source_dir: Path to the source directory
+            
+        Returns:
+            Dictionary with project structure information
+            
+        Raises:
+            AnalysisError: If analysis fails
+        """
+        try:
+            structure = {
+                "directories": [],
+                "source_files": [],
+                "config_files": [],
+                "documentation": [],
+                "tests": []
+            }
+            
+            # Walk through the source directory
+            for root, dirs, files in os.walk(source_dir):
+                # Skip hidden directories
+                dirs[:] = [d for d in dirs if not d.startswith('.')]
+                
+                root_path = Path(root)
+                relative_root = root_path.relative_to(source_dir)
+                
+                # Add directories
+                for dir_name in dirs:
+                    dir_path = root_path / dir_name
+                    relative_path = dir_path.relative_to(source_dir)
+                    structure["directories"].append(str(relative_path))
+                    
+                    # Categorize directories
+                    if dir_name in ['doc', 'docs', 'documentation']:
+                        structure["documentation"].append(str(relative_path))
+                    elif dir_name in ['test', 'tests', 'testing']:
+                        structure["tests"].append(str(relative_path))
+                
+                # Add files
+                for file in files:
+                    file_path = root_path / file
+                    relative_path = file_path.relative_to(source_dir)
+                    
+                    # Categorize files
+                    if file.endswith(('.c', '.cpp', '.cxx', '.cc', '.h', '.hpp')):
+                        structure["source_files"].append(str(relative_path))
+                    elif file in ['configure', 'CMakeLists.txt', 'meson.build', 'Makefile', 'makefile']:
+                        structure["config_files"].append(str(relative_path))
+            
+            return structure
+            
+        except Exception as e:
+            logger.error(f"Failed to analyze project structure: {e}")
+            raise AnalysisError(f"Failed to analyze project structure: {e}") from e

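The regex-driven scan in `_extract_file_dependencies` can be reproduced stand-alone. A minimal sketch, with illustrative patterns — the real table lives in `SourceAnalyzer.dependency_patterns`, defined earlier in `analyzer.py`, so the entries below are stand-ins, not the shipped patterns:

```python
import re
from typing import Set

# Illustrative stand-ins for SourceAnalyzer.dependency_patterns (assumption:
# the real table is keyed the same way, by the file-type strings used above).
DEPENDENCY_PATTERNS = {
    "c": [r'#include\s*[<"]([^>"]+)[>"]'],
    "python": [r'^\s*import\s+([\w.]+)', r'^\s*from\s+([\w.]+)\s+import'],
}

def extract_dependencies(content: str, file_type: str) -> Set[str]:
    """Collect every match of every pattern registered for the file type."""
    deps: Set[str] = set()
    for pattern in DEPENDENCY_PATTERNS.get(file_type, []):
        deps.update(re.findall(pattern, content, re.MULTILINE))
    return deps
```

For example, `extract_dependencies('#include <stdio.h>', "c")` yields `{'stdio.h'}`; unknown file types fall through to an empty set, matching the analyzer's behavior of only scanning recognized extensions.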
+ 218 - 0
src/autusm/cli.py

@@ -0,0 +1,218 @@
+"""
+Command line interface for autusm.
+
+This module provides the main CLI entry point for the autusm utility,
+handling argument parsing and coordinating the various components.
+"""
+
+import sys
+import logging
+from pathlib import Path
+from typing import Optional
+
+import click
+
+from .download import DownloadManager
+from .extractor import ArchiveExtractor
+from .analyzer import SourceAnalyzer
+from .metadata import MetadataExtractor
+from .generator import ScriptGenerator
+from .manifest import ManifestGenerator
+from .usm_integration import USMIntegration
+from .interaction import UserInteraction
+from .models import PackageInfo, USMManifest
+from .exceptions import AutusmError
+
+
+# Configure logging
+logging.basicConfig(
+    level=logging.INFO,
+    format="%(asctime)s - %(name)s - %(levelname)s - %(message)s"
+)
+logger = logging.getLogger(__name__)
+
+
+@click.command()
+@click.argument("url", required=True)
+@click.option(
+    "--output-dir", "-o",
+    type=click.Path(exists=False, file_okay=False, dir_okay=True),
+    default=".",
+    help="Output directory for generated files (default: current directory)"
+)
+@click.option(
+    "--work-dir", "-w",
+    type=click.Path(exists=False, file_okay=False, dir_okay=True),
+    help="Working directory for temporary files (default: system temp)"
+)
+@click.option(
+    "--name", "-n",
+    help="Override package name"
+)
+@click.option(
+    "--version", "-v",
+    help="Override package version"
+)
+@click.option(
+    "--summary", "-s",
+    help="Override package summary"
+)
+@click.option(
+    "--non-interactive",
+    is_flag=True,
+    default=False,
+    help="Run in non-interactive mode (fail on missing information)"
+)
+@click.option(
+    "--verbose", "-V",
+    is_flag=True,
+    default=False,
+    help="Enable verbose output"
+)
+@click.option(
+    "--quiet", "-q",
+    is_flag=True,
+    default=False,
+    help="Suppress non-error output"
+)
+@click.option(
+    "--skip-usm-check",
+    is_flag=True,
+    default=False,
+    help="Skip USM availability check and autoprovides"
+)
+def main(
+    url: str,
+    output_dir: str,
+    work_dir: Optional[str],
+    name: Optional[str],
+    version: Optional[str],
+    summary: Optional[str],
+    non_interactive: bool,
+    verbose: bool,
+    quiet: bool,
+    skip_usm_check: bool
+) -> None:
+    """
+    Generate USM manifest from source archive URL.
+    
+    URL is the URL to a source archive (tar, tar.gz, tar.bz2, zip).
+    """
+    # Configure logging level
+    if verbose:
+        logging.getLogger().setLevel(logging.DEBUG)
+    elif quiet:
+        logging.getLogger().setLevel(logging.ERROR)
+
+    try:
+        # Initialize components
+        download_manager = DownloadManager()
+        extractor = ArchiveExtractor()
+        analyzer = SourceAnalyzer()
+        metadata_extractor = MetadataExtractor()
+        script_generator = ScriptGenerator()
+        manifest_generator = ManifestGenerator()
+        usm_integration = USMIntegration()
+        user_interaction = UserInteraction(interactive=not non_interactive)
+
+        # Set up working directory
+        if work_dir:
+            work_path = Path(work_dir)
+            work_path.mkdir(parents=True, exist_ok=True)
+        else:
+            import tempfile
+            work_path = Path(tempfile.mkdtemp(prefix="autusm-"))
+
+        logger.info(f"Using working directory: {work_path}")
+
+        # Step 1: Download the source archive
+        logger.info(f"Downloading source archive from: {url}")
+        archive_path = download_manager.download(url, work_path)
+        logger.info(f"Downloaded to: {archive_path}")
+
+        # Step 2: Extract the archive
+        logger.info("Extracting archive...")
+        source_dir = extractor.extract(archive_path, work_path)
+        logger.info(f"Extracted to: {source_dir}")
+
+        # Step 3: Analyze the source code
+        logger.info("Analyzing source code...")
+        build_system = analyzer.detect_build_system(source_dir)
+        logger.info(f"Detected build system: {build_system.type.value}")
+
+        # Step 4: Extract metadata
+        logger.info("Extracting metadata...")
+        package_info = metadata_extractor.extract(source_dir)
+
+        # Override with command line arguments if provided
+        if name:
+            package_info.name = name
+        if version:
+            package_info.version = version
+        if summary:
+            package_info.summary = summary
+
+        # Set the source directory
+        package_info.source_dir = str(source_dir)
+
+        # Step 5: Fill in missing information interactively if needed
+        package_info = user_interaction.fill_missing_info(package_info)
+
+        # Step 6: Generate USM scripts
+        logger.info("Generating USM scripts...")
+        scripts_dir = Path(output_dir) / "scripts"
+        scripts_dir.mkdir(parents=True, exist_ok=True)
+        
+        script_generator.generate_scripts(
+            package_info,
+            build_system,
+            scripts_dir
+        )
+
+        # Step 7: Generate USM manifest
+        logger.info("Generating USM manifest...")
+        manifest = manifest_generator.generate(package_info, build_system)
+
+        # Step 8: Get autoprovides from USM if available
+        if not skip_usm_check and usm_integration.is_available():
+            logger.info("Getting autoprovides from USM...")
+            autoprovides = usm_integration.get_autoprovides(source_dir)
+            if autoprovides:
+                # Merge autoprovides into manifest
+                for resource_ref, resource in autoprovides.items():
+                    manifest.provides[resource_ref] = resource
+
+        # Step 9: Write the manifest file
+        output_path = Path(output_dir) / "MANIFEST.usm"
+        with open(output_path, "w") as f:
+            f.write(manifest.to_json())
+        
+        logger.info(f"USM manifest written to: {output_path}")
+        
+        # Print summary
+        if not quiet:
+            click.echo(f"\nSuccessfully generated USM manifest for {package_info.name}")
+            click.echo(f"Package: {package_info.name} v{package_info.version}")
+            click.echo(f"Summary: {package_info.summary}")
+            click.echo(f"Build System: {build_system.type.value}")
+            click.echo("Output files:")
+            click.echo(f"  - {output_path}")
+            click.echo(f"  - {scripts_dir}/ (USM scripts)")
+
+    except AutusmError as e:
+        logger.error(f"Error: {e}")
+        sys.exit(1)
+    except KeyboardInterrupt:
+        logger.info("Operation cancelled by user")
+        sys.exit(1)
+    except Exception as e:
+        logger.error(f"Unexpected error: {e}")
+        if verbose:
+            import traceback
+            traceback.print_exc()
+        sys.exit(1)
+
+
+if __name__ == "__main__":
+    main()

+ 157 - 0
src/autusm/download.py

@@ -0,0 +1,157 @@
+"""
+Download manager for autusm.
+
+This module provides functionality to download source archives from URLs
+with progress tracking and error handling.
+"""
+
+import os
+import logging
+import urllib.parse
+from pathlib import Path
+
+import requests
+
+from .exceptions import DownloadError
+
+
+logger = logging.getLogger(__name__)
+
+
+class DownloadManager:
+    """Manager for downloading source archives from URLs."""
+
+    def __init__(self, timeout: int = 30, chunk_size: int = 8192):
+        """Initialize the download manager.
+        
+        Args:
+            timeout: Request timeout in seconds
+            chunk_size: Size of chunks to download at a time
+        """
+        self.timeout = timeout
+        self.chunk_size = chunk_size
+
+    def download(self, url: str, destination: Path) -> Path:
+        """Download a file from the given URL to the destination directory.
+        
+        Args:
+            url: URL to download from
+            destination: Directory to save the file to
+            
+        Returns:
+            Path to the downloaded file
+            
+        Raises:
+            DownloadError: If the download fails
+        """
+        try:
+            # Parse URL to get filename
+            parsed_url = urllib.parse.urlparse(url)
+            filename = os.path.basename(parsed_url.path)
+            
+            if not filename:
+                # Generate a filename from the URL if none is found
+                filename = f"download_{hash(url) % 1000000}"
+                
+            # Determine file path
+            file_path = destination / filename
+            
+            # Create destination directory if it doesn't exist
+            destination.mkdir(parents=True, exist_ok=True)
+            
+            logger.info(f"Downloading {url} to {file_path}")
+            
+            # Download with progress tracking
+            self._download_with_progress(url, file_path)
+            
+            logger.info(f"Successfully downloaded to {file_path}")
+            return file_path
+            
+        except DownloadError:
+            raise
+        except Exception as e:
+            logger.error(f"Failed to download {url}: {e}")
+            raise DownloadError(f"Failed to download {url}: {e}") from e
+
+    def _download_with_progress(self, url: str, file_path: Path) -> None:
+        """Download a file with progress tracking.
+        
+        Args:
+            url: URL to download from
+            file_path: Path to save the file to
+        """
+        try:
+            # Use requests for better progress tracking
+            with requests.get(url, stream=True, timeout=self.timeout) as response:
+                response.raise_for_status()
+                
+                # Get total file size
+                total_size = int(response.headers.get('content-length', 0))
+                
+                # Open file for writing
+                with open(file_path, 'wb') as f:
+                    downloaded = 0
+                    
+                    for chunk in response.iter_content(chunk_size=self.chunk_size):
+                        if chunk:  # Filter out keep-alive chunks
+                            f.write(chunk)
+                            downloaded += len(chunk)
+                            
+                            # Log progress
+                            if total_size > 0:
+                                percent = (downloaded / total_size) * 100
+                                logger.debug(f"Downloaded {downloaded}/{total_size} bytes ({percent:.1f}%)")
+                            else:
+                                logger.debug(f"Downloaded {downloaded} bytes")
+                                
+        except requests.exceptions.RequestException as e:
+            raise DownloadError(f"Request failed: {e}")
+        except IOError as e:
+            raise DownloadError(f"File write failed: {e}")
+
+    def get_filename_from_url(self, url: str) -> str:
+        """Extract filename from URL.
+        
+        Args:
+            url: URL to extract filename from
+            
+        Returns:
+            Filename extracted from URL
+        """
+        parsed_url = urllib.parse.urlparse(url)
+        filename = os.path.basename(parsed_url.path)
+        
+        if not filename:
+            # Try to get filename from Content-Disposition header
+            try:
+                with requests.head(url, timeout=self.timeout, allow_redirects=True) as response:
+                    content_disposition = response.headers.get('content-disposition')
+                    if content_disposition and 'filename=' in content_disposition:
+                        # Extract filename from Content-Disposition
+                        parts = content_disposition.split('filename=')
+                        if len(parts) > 1:
+                            filename = parts[1].strip('"')
+            except requests.exceptions.RequestException:
+                pass
+                
+        if not filename:
+            # Generate a filename if none is found
+            filename = f"download_{hash(url) % 1000000}"
+            
+        return filename
+
+    def is_url_accessible(self, url: str) -> bool:
+        """Check if a URL is accessible.
+        
+        Args:
+            url: URL to check
+            
+        Returns:
+            True if URL is accessible, False otherwise
+        """
+        try:
+            with requests.head(url, timeout=self.timeout, allow_redirects=True) as response:
+                return response.status_code < 400
+        except requests.exceptions.RequestException:
+            return False

+ 56 - 0
src/autusm/exceptions.py

@@ -0,0 +1,56 @@
+"""
+Custom exceptions for autusm package.
+
+This module defines the custom exception classes used throughout the autusm
+package to provide better error handling and user feedback.
+"""
+
+
+class AutusmError(Exception):
+    """Base exception class for all autusm errors."""
+    pass
+
+
+class DownloadError(AutusmError):
+    """Exception raised when downloading fails."""
+    pass
+
+
+class ExtractionError(AutusmError):
+    """Exception raised when archive extraction fails."""
+    pass
+
+
+class AnalysisError(AutusmError):
+    """Exception raised when source code analysis fails."""
+    pass
+
+
+class MetadataError(AutusmError):
+    """Exception raised when metadata extraction fails."""
+    pass
+
+
+class ScriptGenerationError(AutusmError):
+    """Exception raised when script generation fails."""
+    pass
+
+
+class ManifestGenerationError(AutusmError):
+    """Exception raised when manifest generation fails."""
+    pass
+
+
+class USMIntegrationError(AutusmError):
+    """Exception raised when USM integration fails."""
+    pass
+
+
+class ValidationError(AutusmError):
+    """Exception raised when validation fails."""
+    pass
+
+
+class ConfigurationError(AutusmError):
+    """Exception raised when there's a configuration error."""
+    pass

+ 253 - 0
src/autusm/extractor.py

@@ -0,0 +1,253 @@
+"""
+Archive extractor for autusm.
+
+This module provides functionality to extract various archive formats
+including tar, tar.gz, tar.bz2, and zip.
+"""
+
+import os
+import logging
+import tarfile
+import zipfile
+from pathlib import Path
+from typing import List, Optional
+
+from .exceptions import ExtractionError
+
+
+logger = logging.getLogger(__name__)
+
+
+class ArchiveExtractor:
+    """Extractor for various archive formats."""
+
+    def __init__(self):
+        """Initialize the archive extractor."""
+        self.supported_formats = {
+            '.tar': self._extract_tar,
+            '.tar.gz': self._extract_tar_gz,
+            '.tgz': self._extract_tar_gz,
+            '.tar.bz2': self._extract_tar_bz2,
+            '.tbz2': self._extract_tar_bz2,
+            '.zip': self._extract_zip,
+        }
+
+    def extract(self, archive_path: Path, destination: Path) -> Path:
+        """Extract an archive to the destination directory.
+        
+        Args:
+            archive_path: Path to the archive file
+            destination: Directory to extract to
+            
+        Returns:
+            Path to the extracted source directory
+            
+        Raises:
+            ExtractionError: If extraction fails
+        """
+        try:
+            # Determine archive format
+            archive_format = self._detect_format(archive_path)
+            
+            if not archive_format:
+                raise ExtractionError(f"Unsupported archive format: {archive_path}")
+            
+            # Create destination directory if it doesn't exist
+            destination.mkdir(parents=True, exist_ok=True)
+            
+            logger.info(f"Extracting {archive_path} (format: {archive_format}) to {destination}")
+            
+            # Extract the archive
+            extract_func = self.supported_formats[archive_format]
+            extract_func(archive_path, destination)
+            
+            # Find the extracted source directory
+            source_dir = self._find_source_directory(destination)
+            
+            logger.info(f"Successfully extracted to {source_dir}")
+            return source_dir
+            
+        except ExtractionError:
+            raise
+        except Exception as e:
+            logger.error(f"Failed to extract {archive_path}: {e}")
+            raise ExtractionError(f"Failed to extract {archive_path}: {e}") from e
+
+    def _detect_format(self, archive_path: Path) -> Optional[str]:
+        """Detect the archive format from the file extension.
+        
+        Args:
+            archive_path: Path to the archive file
+            
+        Returns:
+            Archive format string or None if unsupported
+        """
+        archive_name = archive_path.name.lower()
+        
+        for format_suffix in sorted(self.supported_formats.keys(), key=len, reverse=True):
+            if archive_name.endswith(format_suffix):
+                return format_suffix
+                
+        return None
+
+    def _extract_tar(self, archive_path: Path, destination: Path) -> None:
+        """Extract a plain tar archive.
+        
+        Args:
+            archive_path: Path to the tar archive
+            destination: Directory to extract to
+        """
+        with tarfile.open(archive_path, 'r:') as tar:
+            # Check for security issues
+            self._check_tar_security(tar)
+            tar.extractall(destination)
+
+    def _extract_tar_gz(self, archive_path: Path, destination: Path) -> None:
+        """Extract a gzipped tar archive.
+        
+        Args:
+            archive_path: Path to the tar.gz archive
+            destination: Directory to extract to
+        """
+        with tarfile.open(archive_path, 'r:gz') as tar:
+            # Check for security issues
+            self._check_tar_security(tar)
+            tar.extractall(destination)
+
+    def _extract_tar_bz2(self, archive_path: Path, destination: Path) -> None:
+        """Extract a bzip2-compressed tar archive.
+        
+        Args:
+            archive_path: Path to the tar.bz2 archive
+            destination: Directory to extract to
+        """
+        with tarfile.open(archive_path, 'r:bz2') as tar:
+            # Check for security issues
+            self._check_tar_security(tar)
+            tar.extractall(destination)
+
+    def _extract_zip(self, archive_path: Path, destination: Path) -> None:
+        """Extract a zip archive.
+        
+        Args:
+            archive_path: Path to the zip archive
+            destination: Directory to extract to
+        """
+        with zipfile.ZipFile(archive_path, 'r') as zip_ref:
+            # Check for security issues
+            self._check_zip_security(zip_ref)
+            zip_ref.extractall(destination)
+
+    def _check_tar_security(self, tar: tarfile.TarFile) -> None:
+        """Check tar file for security issues.
+        
+        Args:
+            tar: TarFile object to check
+            
+        Raises:
+            ExtractionError: If security issues are found
+        """
+        for member in tar.getmembers():
+            # Check for absolute paths
+            if os.path.isabs(member.name) or member.name.startswith("/"):
+                raise ExtractionError(f"Archive contains absolute path: {member.name}")
+            
+            # Check for path traversal attempts
+            if ".." in member.name:
+                raise ExtractionError(f"Archive contains potentially dangerous path: {member.name}")
+            
+            # Check for symlinks/hardlinks whose target escapes the extraction directory
+            if (member.issym() or member.islnk()) and (
+                os.path.isabs(member.linkname) or ".." in member.linkname
+            ):
+                raise ExtractionError(
+                    f"Archive contains unsafe link target: {member.name} -> {member.linkname}"
+                )
+
+    def _check_zip_security(self, zip_ref: zipfile.ZipFile) -> None:
+        """Check zip file for security issues.
+        
+        Args:
+            zip_ref: ZipFile object to check
+            
+        Raises:
+            ExtractionError: If security issues are found
+        """
+        for member in zip_ref.infolist():
+            # Check for absolute paths
+            if os.path.isabs(member.filename):
+                raise ExtractionError(f"Archive contains absolute path: {member.filename}")
+            
+            # Check for path traversal attempts
+            if ".." in member.filename or member.filename.startswith("/"):
+                raise ExtractionError(f"Archive contains potentially dangerous path: {member.filename}")
+
+    def _find_source_directory(self, destination: Path) -> Path:
+        """Find the main source directory after extraction.
+        
+        Args:
+            destination: Directory where archive was extracted
+            
+        Returns:
+            Path to the main source directory
+        """
+        # List all items in the destination directory
+        items = [item for item in destination.iterdir() if item.is_dir()]
+        
+        # If there's only one directory, that's likely the source directory
+        if len(items) == 1:
+            return items[0]
+        
+        # If there are multiple directories, try to find the most likely source directory
+        # by looking for common indicators
+        for item in items:
+            if self._is_likely_source_dir(item):
+                return item
+        
+        # If no clear source directory is found, return the destination itself
+        return destination
+
+    def _is_likely_source_dir(self, path: Path) -> bool:
+        """Check if a directory is likely the main source directory.
+        
+        Args:
+            path: Directory path to check
+            
+        Returns:
+            True if likely the main source directory
+        """
+        # Look for common source directory indicators
+        indicators = [
+            "configure", "configure.ac", "configure.in",
+            "CMakeLists.txt", "meson.build",
+            "Makefile", "makefile",
+            "setup.py", "pyproject.toml",
+            "Cargo.toml", "package.json",
+            "src", "lib", "include"
+        ]
+        
+        for item in path.iterdir():
+            if item.name in indicators:
+                return True
+                
+        return False
+
+    def list_contents(self, archive_path: Path) -> List[str]:
+        """List the contents of an archive without extracting.
+        
+        Args:
+            archive_path: Path to the archive file
+            
+        Returns:
+            List of file paths in the archive
+            
+        Raises:
+            ExtractionError: If listing fails
+        """
+        try:
+            archive_format = self._detect_format(archive_path)
+            
+            if not archive_format:
+                raise ExtractionError(f"Unsupported archive format: {archive_path}")
+            
+            if archive_format in ['.tar', '.tar.gz', '.tgz', '.tar.bz2', '.tbz2']:
+                with tarfile.open(archive_path, 'r:*') as tar:
+                    return tar.getnames()
+            elif archive_format == '.zip':
+                with zipfile.ZipFile(archive_path, 'r') as zip_ref:
+                    return zip_ref.namelist()
+            else:
+                raise ExtractionError(f"Unsupported archive format: {archive_format}")
+                
+        except ExtractionError:
+            raise
+        except Exception as e:
+            raise ExtractionError(f"Failed to list contents of {archive_path}: {e}") from e

+ 752 - 0
src/autusm/generator.py

@@ -0,0 +1,752 @@
+"""
+Script generator for autusm.
+
+This module provides functionality to generate USM-compatible scripts
+for acquiring, building, and installing packages.
+"""
+
+import os
+import logging
+from pathlib import Path
+from typing import Dict, List, Optional
+
+from .models import BuildSystem, BuildSystemType, PackageInfo
+from .exceptions import ScriptGenerationError
+
+
+logger = logging.getLogger(__name__)
+
+
+class ScriptGenerator:
+    """Generator for USM-compatible scripts."""
+
+    def __init__(self):
+        """Initialize the script generator."""
+        # Script templates for different build systems
+        self.script_templates = {
+            BuildSystemType.AUTOTOOLS: {
+                "acquire": self._autotools_acquire_template,
+                "build": self._autotools_build_template,
+                "install": self._autotools_install_template
+            },
+            BuildSystemType.CMAKE: {
+                "acquire": self._cmake_acquire_template,
+                "build": self._cmake_build_template,
+                "install": self._cmake_install_template
+            },
+            BuildSystemType.MESON: {
+                "acquire": self._meson_acquire_template,
+                "build": self._meson_build_template,
+                "install": self._meson_install_template
+            },
+            BuildSystemType.MAKE: {
+                "acquire": self._make_acquire_template,
+                "build": self._make_build_template,
+                "install": self._make_install_template
+            },
+            BuildSystemType.PYTHON: {
+                "acquire": self._python_acquire_template,
+                "build": self._python_build_template,
+                "install": self._python_install_template
+            },
+            BuildSystemType.CARGO: {
+                "acquire": self._cargo_acquire_template,
+                "build": self._cargo_build_template,
+                "install": self._cargo_install_template
+            },
+            BuildSystemType.NPM: {
+                "acquire": self._npm_acquire_template,
+                "build": self._npm_build_template,
+                "install": self._npm_install_template
+            },
+            BuildSystemType.UNKNOWN: {
+                "acquire": self._generic_acquire_template,
+                "build": self._generic_build_template,
+                "install": self._generic_install_template
+            }
+        }
+
+    def generate_scripts(self, package_info: PackageInfo, build_system: BuildSystem, output_dir: Path) -> None:
+        """Generate USM scripts for a package.
+        
+        Args:
+            package_info: Package information
+            build_system: Detected build system
+            output_dir: Directory to write scripts to
+            
+        Raises:
+            ScriptGenerationError: If script generation fails
+        """
+        try:
+            logger.info(f"Generating USM scripts for {package_info.name}")
+            
+            # Create output directory if it doesn't exist
+            output_dir.mkdir(parents=True, exist_ok=True)
+            
+            # Get script templates for the build system
+            templates = self.script_templates.get(
+                build_system.type,
+                self.script_templates[BuildSystemType.UNKNOWN]
+            )
+            
+            # Generate acquire script
+            acquire_content = templates["acquire"](package_info, build_system)
+            acquire_path = output_dir / "acquire"
+            self._write_script(acquire_path, acquire_content)
+            
+            # Generate build script
+            build_content = templates["build"](package_info, build_system)
+            build_path = output_dir / "build"
+            self._write_script(build_path, build_content)
+            
+            # Generate install script
+            install_content = templates["install"](package_info, build_system)
+            install_path = output_dir / "install"
+            self._write_script(install_path, install_content)
+            
+            logger.info(f"Generated scripts in {output_dir}")
+            
+        except Exception as e:
+            logger.error(f"Failed to generate scripts: {e}")
+            raise ScriptGenerationError(f"Failed to generate scripts: {e}") from e
+
+    def _write_script(self, script_path: Path, content: str) -> None:
+        """Write a script file with proper permissions.
+        
+        Args:
+            script_path: Path to the script file
+            content: Script content
+        """
+        with open(script_path, 'w', encoding='utf-8') as f:
+            f.write(content)
+        
+        # Make the script executable
+        os.chmod(script_path, 0o755)
+
+    def _autotools_acquire_template(self, package_info: PackageInfo, build_system: BuildSystem) -> str:
+        """Generate acquire script for autotools-based packages."""
+        return f"""#!/bin/sh
+# Acquire script for {package_info.name}
+
+# This script is used to download the source code
+# It's typically used when creating USM packages from upstream sources
+
+# For autotools packages, the source is usually already available
+# If you need to download from a specific URL, uncomment and modify:
+# wget -O - {package_info.url or 'https://example.com/source.tar.gz'} | tar -xz
+
+# Or if using git:
+# git clone {package_info.url or 'https://github.com/example/repo.git'} .
+
+echo "Source acquired for {package_info.name}"
+"""
+
+    def _autotools_build_template(self, package_info: PackageInfo, build_system: BuildSystem) -> str:
+        """Generate build script for autotools-based packages."""
+        custom_args = ""
+        if build_system.custom_args:
+            # Autotools arguments are stored as complete flags in the values
+            # (keys are unused here, unlike the CMake/Meson templates)
+            for value in build_system.custom_args.values():
+                custom_args += f" {value}"
+        
+        return f"""#!/bin/sh
+# Build script for {package_info.name}
+
+# Build directory is passed as the first argument
+BUILD_DIR="$1"
+
+if [ -z "$BUILD_DIR" ]; then
+    echo "Error: Build directory not specified"
+    exit 1
+fi
+
+echo "Building {package_info.name} in $BUILD_DIR"
+
+# Prepare the build
+if [ -f "autogen.sh" ]; then
+    echo "Running autogen.sh..."
+    ./autogen.sh
+elif [ -f "bootstrap" ]; then
+    echo "Running bootstrap..."
+    ./bootstrap
+fi
+
+# Configure the build
+echo "Configuring the build..."
+./configure --prefix=/usr{custom_args}
+
+# Build the package
+echo "Building..."
+make -j$(nproc)
+
+echo "Build completed for {package_info.name}"
+"""
+
+    def _autotools_install_template(self, package_info: PackageInfo, build_system: BuildSystem) -> str:
+        """Generate install script for autotools-based packages."""
+        return f"""#!/bin/sh
+# Install script for {package_info.name}
+
+# Build directory is passed as the first argument
+# Install directory is passed as the second argument
+# Install type is passed as the third argument (fresh, upgrade, downgrade)
+
+BUILD_DIR="$1"
+INSTALL_DIR="$2"
+INSTALL_TYPE="$3"
+
+if [ -z "$BUILD_DIR" ] || [ -z "$INSTALL_DIR" ]; then
+    echo "Error: Build directory or install directory not specified"
+    exit 1
+fi
+
+echo "Installing {package_info.name} to $INSTALL_DIR (type: $INSTALL_TYPE)"
+
+# Change to the source directory
+cd "$BUILD_DIR"
+
+# Install the package
+echo "Installing..."
+make DESTDIR="$INSTALL_DIR" install
+
+echo "Installation completed for {package_info.name}"
+"""
+
+    def _cmake_acquire_template(self, package_info: PackageInfo, build_system: BuildSystem) -> str:
+        """Generate acquire script for CMake-based packages."""
+        return f"""#!/bin/sh
+# Acquire script for {package_info.name}
+
+# This script is used to download the source code
+# It's typically used when creating USM packages from upstream sources
+
+# For CMake packages, the source is usually already available
+# If you need to download from a specific URL, uncomment and modify:
+# wget -O - {package_info.url or 'https://example.com/source.tar.gz'} | tar -xz
+
+# Or if using git:
+# git clone {package_info.url or 'https://github.com/example/repo.git'} .
+
+echo "Source acquired for {package_info.name}"
+"""
+
+    def _cmake_build_template(self, package_info: PackageInfo, build_system: BuildSystem) -> str:
+        """Generate build script for CMake-based packages."""
+        custom_args = ""
+        if build_system.custom_args:
+            for key, value in build_system.custom_args.items():
+                custom_args += f" -D{key}={value}"
+        
+        return f"""#!/bin/sh
+# Build script for {package_info.name}
+
+# Build directory is passed as the first argument
+BUILD_DIR="$1"
+
+if [ -z "$BUILD_DIR" ]; then
+    echo "Error: Build directory not specified"
+    exit 1
+fi
+
+echo "Building {package_info.name} in $BUILD_DIR"
+
+# Remember the source directory before creating the build directory;
+# "$PWD"/.. after the cd would point at the build directory's parent,
+# which is only the source tree when BUILD_DIR is a direct subdirectory
+SRC_DIR="$PWD"
+mkdir -p "$BUILD_DIR"
+cd "$BUILD_DIR"
+
+# Configure the build
+echo "Configuring with CMake..."
+cmake -DCMAKE_INSTALL_PREFIX=/usr{custom_args} "$SRC_DIR"
+
+# Build the package
+echo "Building..."
+make -j$(nproc)
+
+echo "Build completed for {package_info.name}"
+"""
+
+    def _cmake_install_template(self, package_info: PackageInfo, build_system: BuildSystem) -> str:
+        """Generate install script for CMake-based packages."""
+        return f"""#!/bin/sh
+# Install script for {package_info.name}
+
+# Build directory is passed as the first argument
+# Install directory is passed as the second argument
+# Install type is passed as the third argument (fresh, upgrade, downgrade)
+
+BUILD_DIR="$1"
+INSTALL_DIR="$2"
+INSTALL_TYPE="$3"
+
+if [ -z "$BUILD_DIR" ] || [ -z "$INSTALL_DIR" ]; then
+    echo "Error: Build directory or install directory not specified"
+    exit 1
+fi
+
+echo "Installing {package_info.name} to $INSTALL_DIR (type: $INSTALL_TYPE)"
+
+# Change to the build directory
+cd "$BUILD_DIR"
+
+# Install the package
+echo "Installing..."
+make DESTDIR="$INSTALL_DIR" install
+
+echo "Installation completed for {package_info.name}"
+"""
+
+    def _meson_acquire_template(self, package_info: PackageInfo, build_system: BuildSystem) -> str:
+        """Generate acquire script for Meson-based packages."""
+        return f"""#!/bin/sh
+# Acquire script for {package_info.name}
+
+# This script is used to download the source code
+# It's typically used when creating USM packages from upstream sources
+
+# For Meson packages, the source is usually already available
+# If you need to download from a specific URL, uncomment and modify:
+# wget -O - {package_info.url or 'https://example.com/source.tar.gz'} | tar -xz
+
+# Or if using git:
+# git clone {package_info.url or 'https://github.com/example/repo.git'} .
+
+echo "Source acquired for {package_info.name}"
+"""
+
+    def _meson_build_template(self, package_info: PackageInfo, build_system: BuildSystem) -> str:
+        """Generate build script for Meson-based packages."""
+        custom_args = ""
+        if build_system.custom_args:
+            for key, value in build_system.custom_args.items():
+                custom_args += f" -D{key}={value}"
+        
+        return f"""#!/bin/sh
+# Build script for {package_info.name}
+
+# Build directory is passed as the first argument
+BUILD_DIR="$1"
+
+if [ -z "$BUILD_DIR" ]; then
+    echo "Error: Build directory not specified"
+    exit 1
+fi
+
+echo "Building {package_info.name} in $BUILD_DIR"
+
+# Configure the build
+echo "Configuring with Meson..."
+meson setup "$BUILD_DIR" --prefix=/usr{custom_args}
+
+# Build the package
+echo "Building..."
+meson compile -C "$BUILD_DIR"
+
+echo "Build completed for {package_info.name}"
+"""
+
+    def _meson_install_template(self, package_info: PackageInfo, build_system: BuildSystem) -> str:
+        """Generate install script for Meson-based packages."""
+        return f"""#!/bin/sh
+# Install script for {package_info.name}
+
+# Build directory is passed as the first argument
+# Install directory is passed as the second argument
+# Install type is passed as the third argument (fresh, upgrade, downgrade)
+
+BUILD_DIR="$1"
+INSTALL_DIR="$2"
+INSTALL_TYPE="$3"
+
+if [ -z "$BUILD_DIR" ] || [ -z "$INSTALL_DIR" ]; then
+    echo "Error: Build directory or install directory not specified"
+    exit 1
+fi
+
+echo "Installing {package_info.name} to $INSTALL_DIR (type: $INSTALL_TYPE)"
+
+# Install the package
+echo "Installing..."
+meson install -C "$BUILD_DIR" --destdir "$INSTALL_DIR"
+
+echo "Installation completed for {package_info.name}"
+"""
+
+    def _make_acquire_template(self, package_info: PackageInfo, build_system: BuildSystem) -> str:
+        """Generate acquire script for Make-based packages."""
+        return f"""#!/bin/sh
+# Acquire script for {package_info.name}
+
+# This script is used to download the source code
+# It's typically used when creating USM packages from upstream sources
+
+# For Make packages, the source is usually already available
+# If you need to download from a specific URL, uncomment and modify:
+# wget -O - {package_info.url or 'https://example.com/source.tar.gz'} | tar -xz
+
+# Or if using git:
+# git clone {package_info.url or 'https://github.com/example/repo.git'} .
+
+echo "Source acquired for {package_info.name}"
+"""
+
+    def _make_build_template(self, package_info: PackageInfo, build_system: BuildSystem) -> str:
+        """Generate build script for Make-based packages."""
+        return f"""#!/bin/sh
+# Build script for {package_info.name}
+
+# Build directory is passed as the first argument
+BUILD_DIR="$1"
+
+if [ -z "$BUILD_DIR" ]; then
+    echo "Error: Build directory not specified"
+    exit 1
+fi
+
+echo "Building {package_info.name} in $BUILD_DIR"
+
+# Build the package
+echo "Building..."
+make -j$(nproc)
+
+echo "Build completed for {package_info.name}"
+"""
+
+    def _make_install_template(self, package_info: PackageInfo, build_system: BuildSystem) -> str:
+        """Generate install script for Make-based packages."""
+        return f"""#!/bin/sh
+# Install script for {package_info.name}
+
+# Build directory is passed as the first argument
+# Install directory is passed as the second argument
+# Install type is passed as the third argument (fresh, upgrade, downgrade)
+
+BUILD_DIR="$1"
+INSTALL_DIR="$2"
+INSTALL_TYPE="$3"
+
+if [ -z "$BUILD_DIR" ] || [ -z "$INSTALL_DIR" ]; then
+    echo "Error: Build directory or install directory not specified"
+    exit 1
+fi
+
+echo "Installing {package_info.name} to $INSTALL_DIR (type: $INSTALL_TYPE)"
+
+# Change to the source directory
+cd "$BUILD_DIR"
+
+# Install the package
+echo "Installing..."
+make DESTDIR="$INSTALL_DIR" install
+
+echo "Installation completed for {package_info.name}"
+"""
+
+    def _python_acquire_template(self, package_info: PackageInfo, build_system: BuildSystem) -> str:
+        """Generate acquire script for Python-based packages."""
+        return f"""#!/bin/sh
+# Acquire script for {package_info.name}
+
+# This script is used to download the source code
+# It's typically used when creating USM packages from upstream sources
+
+# For Python packages, the source is usually already available
+# If you need to download from a specific URL, uncomment and modify:
+# wget -O - {package_info.url or 'https://example.com/source.tar.gz'} | tar -xz
+
+# Or if using git:
+# git clone {package_info.url or 'https://github.com/example/repo.git'} .
+
+echo "Source acquired for {package_info.name}"
+"""
+
+    def _python_build_template(self, package_info: PackageInfo, build_system: BuildSystem) -> str:
+        """Generate build script for Python-based packages."""
+        return f"""#!/bin/sh
+# Build script for {package_info.name}
+
+# Build directory is passed as the first argument
+BUILD_DIR="$1"
+
+if [ -z "$BUILD_DIR" ]; then
+    echo "Error: Build directory not specified"
+    exit 1
+fi
+
+echo "Building {package_info.name} in $BUILD_DIR"
+
+# Build the package
+echo "Building..."
+# "setup.py build" assumes a setup.py-based project; pyproject-only
+# projects may need "python -m build" instead
+python setup.py build
+
+echo "Build completed for {package_info.name}"
+"""
+
+    def _python_install_template(self, package_info: PackageInfo, build_system: BuildSystem) -> str:
+        """Generate install script for Python-based packages."""
+        return f"""#!/bin/sh
+# Install script for {package_info.name}
+
+# Build directory is passed as the first argument
+# Install directory is passed as the second argument
+# Install type is passed as the third argument (fresh, upgrade, downgrade)
+
+BUILD_DIR="$1"
+INSTALL_DIR="$2"
+INSTALL_TYPE="$3"
+
+if [ -z "$BUILD_DIR" ] || [ -z "$INSTALL_DIR" ]; then
+    echo "Error: Build directory or install directory not specified"
+    exit 1
+fi
+
+echo "Installing {package_info.name} to $INSTALL_DIR (type: $INSTALL_TYPE)"
+
+# Change to the source directory
+cd "$BUILD_DIR"
+
+# Install the package
+echo "Installing..."
+python setup.py install --root="$INSTALL_DIR" --prefix=/usr
+
+echo "Installation completed for {package_info.name}"
+"""
+
+    def _cargo_acquire_template(self, package_info: PackageInfo, build_system: BuildSystem) -> str:
+        """Generate acquire script for Cargo-based packages."""
+        return f"""#!/bin/sh
+# Acquire script for {package_info.name}
+
+# This script is used to download the source code
+# It's typically used when creating USM packages from upstream sources
+
+# For Cargo packages, the source is usually already available
+# If you need to download from a specific URL, uncomment and modify:
+# wget -O - {package_info.url or 'https://example.com/source.tar.gz'} | tar -xz
+
+# Or if using git:
+# git clone {package_info.url or 'https://github.com/example/repo.git'} .
+
+echo "Source acquired for {package_info.name}"
+"""
+
+    def _cargo_build_template(self, package_info: PackageInfo, build_system: BuildSystem) -> str:
+        """Generate build script for Cargo-based packages."""
+        return f"""#!/bin/sh
+# Build script for {package_info.name}
+
+# Build directory is passed as the first argument
+BUILD_DIR="$1"
+
+if [ -z "$BUILD_DIR" ]; then
+    echo "Error: Build directory not specified"
+    exit 1
+fi
+
+echo "Building {package_info.name} in $BUILD_DIR"
+
+# Build the package
+echo "Building..."
+cargo build --release
+
+echo "Build completed for {package_info.name}"
+"""
+
+    def _cargo_install_template(self, package_info: PackageInfo, build_system: BuildSystem) -> str:
+        """Generate install script for Cargo-based packages."""
+        return f"""#!/bin/sh
+# Install script for {package_info.name}
+
+# Build directory is passed as the first argument
+# Install directory is passed as the second argument
+# Install type is passed as the third argument (fresh, upgrade, downgrade)
+
+BUILD_DIR="$1"
+INSTALL_DIR="$2"
+INSTALL_TYPE="$3"
+
+if [ -z "$BUILD_DIR" ] || [ -z "$INSTALL_DIR" ]; then
+    echo "Error: Build directory or install directory not specified"
+    exit 1
+fi
+
+echo "Installing {package_info.name} to $INSTALL_DIR (type: $INSTALL_TYPE)"
+
+# Change to the source directory
+cd "$BUILD_DIR"
+
+# Install the package
+echo "Installing..."
+cargo install --path . --root "$INSTALL_DIR/usr"
+
+echo "Installation completed for {package_info.name}"
+"""
+
+    def _npm_acquire_template(self, package_info: PackageInfo, build_system: BuildSystem) -> str:
+        """Generate acquire script for NPM-based packages."""
+        return f"""#!/bin/sh
+# Acquire script for {package_info.name}
+
+# This script is used to download the source code
+# It's typically used when creating USM packages from upstream sources
+
+# For NPM packages, the source is usually already available
+# If you need to download from a specific URL, uncomment and modify:
+# wget -O - {package_info.url or 'https://example.com/source.tar.gz'} | tar -xz
+
+# Or if using git:
+# git clone {package_info.url or 'https://github.com/example/repo.git'} .
+
+echo "Source acquired for {package_info.name}"
+"""
+
+    def _npm_build_template(self, package_info: PackageInfo, build_system: BuildSystem) -> str:
+        """Generate build script for NPM-based packages."""
+        return f"""#!/bin/sh
+# Build script for {package_info.name}
+
+# Build directory is passed as the first argument
+BUILD_DIR="$1"
+
+if [ -z "$BUILD_DIR" ]; then
+    echo "Error: Build directory not specified"
+    exit 1
+fi
+
+echo "Building {package_info.name} in $BUILD_DIR"
+
+# Install dependencies
+echo "Installing dependencies..."
+npm ci
+
+# Build the package (--if-present skips gracefully when no "build" script exists)
+echo "Building..."
+npm run build --if-present
+
+echo "Build completed for {package_info.name}"
+"""
+
+    def _npm_install_template(self, package_info: PackageInfo, build_system: BuildSystem) -> str:
+        """Generate install script for NPM-based packages."""
+        return f"""#!/bin/sh
+# Install script for {package_info.name}
+
+# Build directory is passed as the first argument
+# Install directory is passed as the second argument
+# Install type is passed as the third argument (fresh, upgrade, downgrade)
+
+BUILD_DIR="$1"
+INSTALL_DIR="$2"
+INSTALL_TYPE="$3"
+
+if [ -z "$BUILD_DIR" ] || [ -z "$INSTALL_DIR" ]; then
+    echo "Error: Build directory or install directory not specified"
+    exit 1
+fi
+
+echo "Installing {package_info.name} to $INSTALL_DIR (type: $INSTALL_TYPE)"
+
+# Change to the source directory
+cd "$BUILD_DIR"
+
+# Install the package globally
+echo "Installing..."
+npm install -g --prefix="$INSTALL_DIR/usr"
+
+echo "Installation completed for {package_info.name}"
+"""
+
+    def _generic_acquire_template(self, package_info: PackageInfo, build_system: BuildSystem) -> str:
+        """Generic acquire script for unknown build systems."""
+        return f"""#!/bin/sh
+# Acquire script for {package_info.name}
+
+# This script is used to download the source code
+# It's typically used when creating USM packages from upstream sources
+
+# For unknown build systems, the source is usually already available
+# If you need to download from a specific URL, uncomment and modify:
+# wget -O - {package_info.url or 'https://example.com/source.tar.gz'} | tar -xz
+
+# Or if using git:
+# git clone {package_info.url or 'https://github.com/example/repo.git'} .
+
+echo "Source acquired for {package_info.name}"
+"""
+
+    def _generic_build_template(self, package_info: PackageInfo, build_system: BuildSystem) -> str:
+        """Generic build script for unknown build systems."""
+        return f"""#!/bin/sh
+# Build script for {package_info.name}
+
+# Build directory is passed as the first argument
+BUILD_DIR="$1"
+
+if [ -z "$BUILD_DIR" ]; then
+    echo "Error: Build directory not specified"
+    exit 1
+fi
+
+echo "Building {package_info.name} in $BUILD_DIR"
+
+# This is a generic build script for unknown build systems
+# You may need to customize this based on the actual build system
+
+# Try common build commands
+if [ -f "Makefile" ] || [ -f "makefile" ]; then
+    echo "Found Makefile, using make..."
+    make -j$(nproc)
+elif [ -f "build.sh" ]; then
+    echo "Found build.sh, running it..."
+    # Run through sh in case the script lacks the executable bit
+    sh ./build.sh
+elif [ -f "CMakeLists.txt" ]; then
+    echo "Found CMakeLists.txt, using CMake..."
+    # Capture the source directory; a bare ".." after the cd is only
+    # correct when BUILD_DIR is a direct subdirectory of the source tree
+    SRC_DIR="$PWD"
+    mkdir -p "$BUILD_DIR"
+    cd "$BUILD_DIR"
+    cmake -DCMAKE_INSTALL_PREFIX=/usr "$SRC_DIR"
+    make -j$(nproc)
+else
+    echo "No known build system found. Please customize this script."
+    exit 1
+fi
+
+echo "Build completed for {package_info.name}"
+"""
+
+    def _generic_install_template(self, package_info: PackageInfo, build_system: BuildSystem) -> str:
+        """Generic install script for unknown build systems."""
+        return f"""#!/bin/sh
+# Install script for {package_info.name}
+
+# Build directory is passed as the first argument
+# Install directory is passed as the second argument
+# Install type is passed as the third argument (fresh, upgrade, downgrade)
+
+BUILD_DIR="$1"
+INSTALL_DIR="$2"
+INSTALL_TYPE="$3"
+
+if [ -z "$BUILD_DIR" ] || [ -z "$INSTALL_DIR" ]; then
+    echo "Error: Build directory or install directory not specified"
+    exit 1
+fi
+
+echo "Installing {package_info.name} to $INSTALL_DIR (type: $INSTALL_TYPE)"
+
+# This is a generic install script for unknown build systems
+# You may need to customize this based on the actual build system
+
+# Try common install commands
+if [ -f "Makefile" ] || [ -f "makefile" ]; then
+    echo "Found Makefile, using make install..."
+    cd "$BUILD_DIR"
+    make DESTDIR="$INSTALL_DIR" install
+elif [ -f "install.sh" ]; then
+    echo "Found install.sh, running it..."
+    cd "$BUILD_DIR"
+    # Run through sh in case the script lacks the executable bit
+    sh ./install.sh --prefix="$INSTALL_DIR/usr"
+else
+    echo "No known install method found. Please customize this script."
+    exit 1
+fi
+
+echo "Installation completed for {package_info.name}"
+"""
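The table-driven dispatch used by `ScriptGenerator` can be illustrated with a standalone sketch. The names below are illustrative stand-ins, not the module's real API; only the pattern (a dict of per-phase template callables with a fallback for unknown build systems) mirrors the code above.

```python
from enum import Enum, auto

class BuildSystemType(Enum):
    CMAKE = auto()
    UNKNOWN = auto()

def cmake_build(name: str) -> str:
    return f"#!/bin/sh\necho 'Building {name} with CMake'\n"

def generic_build(name: str) -> str:
    return f"#!/bin/sh\necho 'Building {name} generically'\n"

# Map each build-system type to its per-phase template callables;
# unrecognized types fall back to the UNKNOWN entry, mirroring
# script_templates.get(..., script_templates[BuildSystemType.UNKNOWN])
TEMPLATES = {
    BuildSystemType.CMAKE: {"build": cmake_build},
    BuildSystemType.UNKNOWN: {"build": generic_build},
}

def render(build_type: BuildSystemType, phase: str, name: str) -> str:
    templates = TEMPLATES.get(build_type, TEMPLATES[BuildSystemType.UNKNOWN])
    return templates[phase](name)
```

Keeping templates as callables (rather than format strings) lets each one inspect the build system's custom arguments before rendering.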

+ 443 - 0
src/autusm/interaction.py

@@ -0,0 +1,443 @@
+"""
+User interaction for autusm.
+
+This module provides functionality to interact with users to gather
+missing information during the manifest generation process.
+"""
+
+import logging
+import re
+from pathlib import Path
+from typing import List, Optional, Dict, Any
+
+from .models import PackageInfo, License, LicenseCategory
+from .exceptions import ValidationError
+
+
+logger = logging.getLogger(__name__)
+
+
+class UserInteraction:
+    """Handler for user interactions."""
+
+    def __init__(self, interactive: bool = True):
+        """Initialize user interaction handler.
+        
+        Args:
+            interactive: Whether to run in interactive mode
+        """
+        self.interactive = interactive
+
+    def fill_missing_info(self, package_info: PackageInfo) -> PackageInfo:
+        """Fill in missing package information through user interaction.
+        
+        Args:
+            package_info: Package information with potentially missing fields
+            
+        Returns:
+            Updated PackageInfo object
+            
+        Raises:
+            ValidationError: If required information is missing and not in interactive mode
+        """
+        if not self.interactive:
+            # Validate required fields in non-interactive mode
+            self._validate_required_fields(package_info)
+            return package_info
+        
+        logger.info("Filling in missing package information")
+        
+        # Fill in basic information
+        if not package_info.name:
+            package_info.name = self._ask_string("Package name", required=True)
+            package_info.name = self._sanitize_package_name(package_info.name)
+        
+        if not package_info.version:
+            package_info.version = self._ask_string("Package version", required=True)
+            package_info.version = self._sanitize_version(package_info.version)
+        
+        if not package_info.summary:
+            package_info.summary = self._ask_string("Package summary", required=True)
+        
+        if not package_info.description:
+            package_info.description = self._ask_multiline("Package description (optional)")
+        
+        if not package_info.url:
+            package_info.url = self._ask_string("Project URL (optional)")
+        
+        # Fill in author information
+        if not package_info.authors:
+            author = self._ask_string("Author name (optional)")
+            if author:
+                package_info.authors.append(author)
+        
+        # Fill in license information
+        if not package_info.licenses:
+            package_info.licenses = self._ask_license_info()
+        
+        # Fill in dependencies
+        if not package_info.runtime_dependencies:
+            deps = self._ask_dependencies("Runtime dependencies")
+            package_info.runtime_dependencies = deps
+        
+        if not package_info.build_dependencies:
+            deps = self._ask_dependencies("Build dependencies")
+            package_info.build_dependencies = deps
+        
+        return package_info
+
+    def _validate_required_fields(self, package_info: PackageInfo) -> None:
+        """Validate that required fields are present.
+        
+        Args:
+            package_info: Package information to validate
+            
+        Raises:
+            ValidationError: If required fields are missing
+        """
+        errors = []
+        
+        if not package_info.name:
+            errors.append("Package name is required")
+        
+        if not package_info.version:
+            errors.append("Package version is required")
+        
+        if not package_info.summary:
+            errors.append("Package summary is required")
+        
+        if errors:
+            raise ValidationError("Missing required information: " + ", ".join(errors))
+
+    def _ask_string(self, prompt: str, required: bool = False, default: Optional[str] = None) -> str:
+        """Ask the user for a string input.
+        
+        Args:
+            prompt: Prompt message
+            required: Whether the input is required
+            default: Default value
+            
+        Returns:
+            User input string
+        """
+        if default:
+            full_prompt = f"{prompt} [{default}]: "
+        else:
+            full_prompt = f"{prompt}: "
+        
+        while True:
+            try:
+                answer = input(full_prompt).strip()
+                
+                if not answer:
+                    if default:
+                        return default
+                    elif required:
+                        print("This field is required.")
+                        continue
+                
+                return answer
+            except (EOFError, KeyboardInterrupt):
+                print("\nOperation cancelled by user.")
+                raise
+
+    def _ask_multiline(self, prompt: str, required: bool = False) -> str:
+        """Ask the user for multiline input.
+        
+        Args:
+            prompt: Prompt message
+            required: Whether the input is required
+            
+        Returns:
+            User input string
+        """
+        print(f"{prompt} (Ctrl-D to finish):")
+        
+        lines = []
+        try:
+            while True:
+                line = input()
+                lines.append(line)
+        except EOFError:
+            # Ctrl-D finishes input; Ctrl-C still propagates as a cancel,
+            # consistent with _ask_string
+            pass
+        
+        result = "\n".join(lines).strip()
+        
+        if not result and required:
+            print("This field is required.")
+            return self._ask_multiline(prompt, required)
+        
+        return result
+
+    def _ask_license_info(self) -> List[License]:
+        """Ask the user for license information.
+        
+        Returns:
+            List of License objects
+        """
+        licenses = []
+        
+        print("\nLicense Information:")
+        print("Common license identifiers: MIT, Apache-2.0, GPL-3.0, BSD-3-Clause, etc.")
+        
+        while True:
+            # Ask for license name
+            license_name = self._ask_string("License name (or empty to finish)", required=False)
+            
+            if not license_name:
+                break
+            
+            # Map to category
+            category = self._map_license_category(license_name)
+            
+            # Ask for license file path
+            license_file = self._ask_string(
+                "License file path", 
+                required=False, 
+                default="LICENSE"
+            )
+            
+            licenses.append(License(
+                name=license_name,
+                text=license_file,
+                category=category
+            ))
+        
+        if not licenses:
+            # Add a default license
+            licenses.append(License(
+                name="Unknown",
+                text="LICENSE",
+                category=LicenseCategory.SOURCE_AVAILABLE
+            ))
+        
+        return licenses
+
+    def _ask_dependencies(self, dep_type: str) -> List[str]:
+        """Ask the user for dependencies.
+        
+        Args:
+            dep_type: Type of dependencies (runtime, build, etc.)
+            
+        Returns:
+            List of dependency strings
+        """
+        print(f"\n{dep_type} Dependencies:")
+        print("Enter dependencies one per line, empty line to finish:")
+        
+        dependencies = []
+        
+        try:
+            while True:
+                dep = input("> ").strip()
+                if not dep:
+                    break
+                dependencies.append(dep)
+        except EOFError:
+            # Ctrl-D finishes the list; Ctrl-C still propagates as a cancel
+            pass
+        
+        return dependencies
+
+    def _map_license_category(self, license_name: str) -> LicenseCategory:
+        """Map a license name to a license category.
+        
+        Args:
+            license_name: Name of the license
+            
+        Returns:
+            LicenseCategory enum value
+        """
+        license_lower = license_name.lower()
+        
+        # Libre licenses
+        libre_licenses = [
+            "mit", "apache", "apache-2.0", "gpl", "gpl-2.0", "gpl-3.0",
+            "lgpl", "lgpl-2.1", "lgpl-3.0", "bsd", "bsd-2-clause",
+            "bsd-3-clause", "isc", "mpl", "mpl-2.0", "agpl", "agpl-3.0",
+            "bsl", "bsl-1.0", "unlicense", "cc0", "cc0-1.0", "epl",
+            "epl-1.0", "epl-2.0"
+        ]
+        
+        
+        # Substring matching is deliberately loose, so "Apache License 2.0"
+        # matches "apache"; exact SPDX matching would be stricter
+        if any(libre in license_lower for libre in libre_licenses):
+        
+        # Proprietary licenses
+        proprietary_licenses = ["proprietary", "commercial"]
+        if any(prop in license_lower for prop in proprietary_licenses):
+            return LicenseCategory.PROPRIETARY
+        
+        # Default to source-available
+        return LicenseCategory.SOURCE_AVAILABLE
+
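The category lookup above can be exercised in isolation. This sketch reimplements just the mapping with a trimmed token list for illustration; the real method uses the full lists and `LicenseCategory` enum values.

```python
# Trimmed token lists, for illustration only
LIBRE = ["mit", "apache", "gpl", "bsd", "isc", "mpl"]
PROPRIETARY = ["proprietary", "commercial"]

def map_license_category(license_name: str) -> str:
    # Same loose substring matching as _map_license_category
    name = license_name.lower()
    if any(token in name for token in LIBRE):
        return "libre"
    if any(token in name for token in PROPRIETARY):
        return "proprietary"
    # Anything unrecognized defaults to source-available
    return "source-available"
```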
+    def _sanitize_package_name(self, name: str) -> str:
+        """Sanitize a package name.
+        
+        Args:
+            name: Package name to sanitize
+            
+        Returns:
+            Sanitized package name
+        """
+        # Replace whitespace runs with underscores first; running the
+        # character filter first would silently delete the spaces
+        sanitized = re.sub(r'\s+', '_', name)
+        
+        # Drop any remaining disallowed characters
+        sanitized = re.sub(r'[^a-zA-Z0-9_\-]', '', sanitized)
+        
+        # Convert to lowercase
+        sanitized = sanitized.lower()
+        
+        return sanitized
+
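The intended sanitization order (collapse whitespace to underscores, then filter characters, then lowercase) can be checked in isolation:

```python
import re

def sanitize_package_name(name: str) -> str:
    # Whitespace runs become underscores before the filter runs,
    # so multi-word names keep their word boundaries
    s = re.sub(r'\s+', '_', name)
    s = re.sub(r'[^a-zA-Z0-9_\-]', '', s)
    return s.lower()

print(sanitize_package_name("My Cool Package!"))  # my_cool_package
```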
+    def _sanitize_version(self, version: str) -> str:
+        """Sanitize a version string.
+        
+        Args:
+            version: Version string to sanitize
+            
+        Returns:
+            Sanitized version string
+        """
+        # Remove leading/trailing whitespace
+        version = version.strip()
+        
+        # Basic validation for semantic versioning
+        # This is simplified - a real implementation would be more rigorous
+        if not re.match(r'^\d+(\.\d+)*([+-].+)?$', version):
+            print(f"Warning: '{version}' may not be a valid version format")
+        
+        return version
+
+    def confirm_action(self, message: str, default: bool = True) -> bool:
+        """Ask the user to confirm an action.
+        
+        Args:
+            message: Confirmation message
+            default: Default answer if user just presses Enter
+            
+        Returns:
+            True if user confirms, False otherwise
+        """
+        if not self.interactive:
+            return default
+        
+        suffix = "Y/n" if default else "y/N"
+        
+        while True:
+            try:
+                answer = input(f"{message} ({suffix}): ").strip().lower()
+                
+                if not answer:
+                    return default
+                
+                if answer in ['y', 'yes']:
+                    return True
+                elif answer in ['n', 'no']:
+                    return False
+                else:
+                    print("Please enter 'y' or 'n'.")
+            except (EOFError, KeyboardInterrupt):
+                print("\nOperation cancelled by user.")
+                raise
+
+    def select_option(self, message: str, options: List[str], default: Optional[int] = None) -> int:
+        """Ask the user to select from a list of options.
+        
+        Args:
+            message: Selection message
+            options: List of option strings
+            default: Default option index
+            
+        Returns:
+            Selected option index
+        """
+        if not self.interactive:
+            return default or 0
+        
+        print(f"\n{message}:")
+        
+        for i, option in enumerate(options, 1):
+            print(f"  {i}. {option}")
+        
+        if default is not None:
+            # default is a 0-based index; show it 1-based to match the menu
+            prompt = f"Select option (1-{len(options)}) [{default + 1}]: "
+        else:
+            prompt = f"Select option (1-{len(options)}): "
+        
+        while True:
+            try:
+                answer = input(prompt).strip()
+                
+                if not answer and default is not None:
+                    return default
+                
+                try:
+                    selection = int(answer)
+                    if 1 <= selection <= len(options):
+                        return selection - 1
+                    else:
+                        print(f"Please enter a number between 1 and {len(options)}.")
+                except ValueError:
+                    print("Please enter a valid number.")
+            except (EOFError, KeyboardInterrupt):
+                print("\nOperation cancelled by user.")
+                raise
+
+    def show_summary(self, package_info: PackageInfo) -> None:
+        """Show a summary of the package information.
+        
+        Args:
+            package_info: Package information to display
+        """
+        print("\n" + "="*50)
+        print("PACKAGE SUMMARY")
+        print("="*50)
+        print(f"Name: {package_info.name}")
+        print(f"Version: {package_info.version}")
+        print(f"Summary: {package_info.summary}")
+        
+        if package_info.description:
+            print(f"Description: {package_info.description}")
+        
+        if package_info.url:
+            print(f"URL: {package_info.url}")
+        
+        if package_info.authors:
+            print(f"Authors: {', '.join(package_info.authors)}")
+        
+        if package_info.licenses:
+            print("Licenses:")
+            for license in package_info.licenses:
+                print(f"  - {license.name} ({license.category.value})")
+        
+        if package_info.runtime_dependencies:
+            print(f"Runtime Dependencies: {', '.join(package_info.runtime_dependencies)}")
+        
+        if package_info.build_dependencies:
+            print(f"Build Dependencies: {', '.join(package_info.build_dependencies)}")
+        
+        print("="*50)
+
+    def edit_field(self, field_name: str, current_value: str) -> str:
+        """Allow the user to edit a field.
+        
+        Args:
+            field_name: Name of the field
+            current_value: Current value of the field
+            
+        Returns:
+            New field value
+        """
+        if not self.interactive:
+            return current_value
+        
+        print(f"\nEditing {field_name}")
+        print(f"Current value: {current_value}")
+        print("Enter new value (or press Enter to keep current):")
+        
+        try:
+            new_value = input("> ").strip()
+            return new_value if new_value else current_value
+        except (EOFError, KeyboardInterrupt):
+            print("\nOperation cancelled by user.")
+            raise
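The prompt loops above retry until they receive a valid answer. The parsing step of `select_option` can be sketched in isolation so it is testable without a TTY (the function name `parse_selection` is illustrative, not part of the module):

```python
from typing import Optional

def parse_selection(answer: str, n_options: int,
                    default: Optional[int] = None) -> Optional[int]:
    """Parse one line of menu input (1-based) into a 0-based index.

    Returns None for invalid input, mirroring the retry loop in
    select_option: the caller would re-prompt in that case.
    """
    answer = answer.strip()
    if not answer and default is not None:
        return default  # Enter accepts the default index unchanged
    try:
        selection = int(answer)
    except ValueError:
        return None  # not a number
    if 1 <= selection <= n_options:
        return selection - 1  # convert 1-based menu choice to 0-based
    return None  # out of range
```

Separating the parsing from the `input()` loop keeps the validation logic unit-testable.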

+ 417 - 0
src/autusm/manifest.py

@@ -0,0 +1,417 @@
+"""
+Manifest generator for autusm.
+
+This module provides functionality to generate USM-compatible MANIFEST.usm
+files from package information and build system details.
+"""
+
+import os
+import logging
+from pathlib import Path
+from typing import Dict, List, Optional, Set, Any
+
+from .models import (
+    PackageInfo, BuildSystem, BuildSystemType, USMManifest, License, LicenseCategory,
+    Dependencies, Executables, Resource, PathBase, FileType
+)
+from .exceptions import ManifestGenerationError
+
+
+logger = logging.getLogger(__name__)
+
+
+class ManifestGenerator:
+    """Generator for USM manifest files."""
+
+    def __init__(self):
+        """Initialize the manifest generator."""
+        # Common resource mappings for different file types
+        self.resource_mappings = {
+            "bin": ["bin/*", "usr/bin/*"],
+            "sbin": ["sbin/*", "usr/sbin/*"],
+            "lib": ["lib/*.so*", "lib64/*.so*", "usr/lib/*.so*", "usr/lib64/*.so*"],
+            "include": ["include/*.h", "usr/include/*.h"],
+            "share": ["share/*", "usr/share/*"],
+            "man": ["share/man/*", "usr/share/man/*"],
+            "doc": ["share/doc/*", "usr/share/doc/*"],
+            "config": ["etc/*"],
+            "desktop": ["share/applications/*.desktop"],
+            "icon": ["share/icons/*", "usr/share/icons/*"]
+        }
+
+    def generate(self, package_info: PackageInfo, build_system: BuildSystem) -> USMManifest:
+        """Generate a USM manifest from package information.
+        
+        Args:
+            package_info: Package information
+            build_system: Build system information
+            
+        Returns:
+            USMManifest object
+            
+        Raises:
+            ManifestGenerationError: If manifest generation fails
+        """
+        try:
+            logger.info(f"Generating USM manifest for {package_info.name}")
+            
+            # Validate required fields
+            if not package_info.name:
+                raise ManifestGenerationError("Package name is required")
+            if not package_info.version:
+                raise ManifestGenerationError("Package version is required")
+            if not package_info.summary:
+                raise ManifestGenerationError("Package summary is required")
+            
+            # Create dependencies
+            dependencies = Dependencies(
+                runtime=self._convert_to_resource_refs(package_info.runtime_dependencies),
+                build=self._convert_to_resource_refs(package_info.build_dependencies),
+                manage=[],
+                acquire=[]
+            )
+            
+            # Create provides section
+            provides = self._generate_provides(package_info, build_system)
+            
+            # Create executables
+            executables = self._generate_executables(package_info, build_system)
+            
+            # Create flags
+            flags = self._generate_flags(build_system)
+            
+            # Create the manifest
+            manifest = USMManifest(
+                name=package_info.name,
+                version=package_info.version,
+                summary=package_info.summary,
+                licences=package_info.licenses or [License(
+                    name="Unknown",
+                    text="LICENSE",
+                    category=LicenseCategory.SOURCE_AVAILABLE
+                )],
+                provides=provides,
+                depends=dependencies,
+                flags=flags,
+                execs=executables,
+                md=self._find_description_file(package_info),
+                url=package_info.url
+            )
+            
+            logger.info(f"Generated manifest for {package_info.name}")
+            return manifest
+            
+        except Exception as e:
+            logger.error(f"Failed to generate manifest: {e}")
+            raise ManifestGenerationError(f"Failed to generate manifest: {e}")
+
+    def _convert_to_resource_refs(self, dependencies: List[str]) -> List[str]:
+        """Convert dependency strings to resource references.
+        
+        Args:
+            dependencies: List of dependency strings
+            
+        Returns:
+            List of resource references
+        """
+        resource_refs = []
+        
+        for dep in dependencies:
+            # This is a simplified conversion
+            # In a real implementation, you'd need more sophisticated mapping
+            dep_lower = dep.lower()
+            
+            # Common package name to resource reference mappings
+            if dep_lower.startswith("lib"):
+                # Library dependency
+                resource_refs.append(f"lib:{dep_lower}")
+            elif dep_lower.endswith(("-dev", "-devel")):
+                # Development dependency; strip the longer "-devel" suffix
+                # first so it is not mangled by the "-dev" case
+                if dep_lower.endswith("-devel"):
+                    base_name = dep_lower[:-len("-devel")]
+                else:
+                    base_name = dep_lower[:-len("-dev")]
+                resource_refs.append(f"inc:{base_name}")
+            elif dep_lower in ["gcc", "clang", "rustc"]:
+                # Compiler dependency
+                resource_refs.append(f"bin:{dep_lower}")
+            elif dep_lower in ["python", "python3", "node", "nodejs"]:
+                # Runtime dependency
+                resource_refs.append(f"bin:{dep_lower}")
+            else:
+                # Default to bin for unknown dependencies
+                resource_refs.append(f"bin:{dep_lower}")
+        
+        return resource_refs
+
+    def _generate_provides(self, package_info: PackageInfo, build_system: BuildSystem) -> Dict[str, Any]:
+        """Generate the provides section of the manifest.
+        
+        Args:
+            package_info: Package information
+            build_system: Build system information
+            
+        Returns:
+            Dictionary of provided resources
+        """
+        provides = {}
+        
+        # Add common binaries
+        if package_info.name:
+            provides[f"bin:{package_info.name}"] = "as-expected"
+        
+        # Add libraries if it's a library package
+        if self._is_library_package(package_info, build_system):
+            lib_name = f"lib{package_info.name}" if not package_info.name.startswith("lib") else package_info.name
+            provides[f"lib:{lib_name}.so"] = "as-expected"
+        
+        # Add development files if applicable
+        if self._has_development_files(package_info):
+            # Strip only a leading "lib", not every occurrence in the name
+            inc_name = package_info.name.removeprefix("lib")
+            provides[f"inc:{inc_name}.h"] = "as-expected"
+        
+        # Add documentation
+        provides[f"res:doc/{package_info.name}"] = "as-expected"
+        
+        # Add man pages if they exist
+        if self._has_man_pages(package_info):
+            provides[f"man:{package_info.name}.1"] = "as-expected"
+        
+        # Add desktop file for GUI applications
+        if self._is_gui_application(package_info):
+            provides[f"app:{package_info.name}.desktop"] = "as-expected"
+        
+        return provides
+
+    def _generate_executables(self, package_info: PackageInfo, build_system: BuildSystem) -> Executables:
+        """Generate the executables section of the manifest.
+        
+        Args:
+            package_info: Package information
+            build_system: Build system information
+            
+        Returns:
+            Executables object
+        """
+        return Executables(
+            build="scripts/build",
+            install="scripts/install",
+            acquire="scripts/acquire"
+        )
+
+    def _generate_flags(self, build_system: BuildSystem) -> List[str]:
+        """Generate the flags section of the manifest.
+        
+        Args:
+            build_system: Build system information
+            
+        Returns:
+            List of flags
+        """
+        flags = []
+        
+        # Add buildInSourceTree flag for certain build systems
+        if build_system.type in [BuildSystemType.MAKE, BuildSystemType.PYTHON]:
+            flags.append("buildInSourceTree")
+        
+        return flags
+
+    def _is_library_package(self, package_info: PackageInfo, build_system: BuildSystem) -> bool:
+        """Determine if this is a library package.
+        
+        Args:
+            package_info: Package information
+            build_system: Build system information
+            
+        Returns:
+            True if this is a library package
+        """
+        # Check package name
+        if package_info.name and package_info.name.startswith("lib"):
+            return True
+        
+        # Check for common library indicators
+        if package_info.source_dir:
+            source_path = Path(package_info.source_dir)
+            
+            # Look for include directories
+            if (source_path / "include").exists():
+                return True
+            
+            # Look for header files
+            if any(source_path.rglob("*.h")):
+                return True
+            
+            # Look for shared library files
+            if any(source_path.rglob("*.so")):
+                return True
+        
+        return False
+
+    def _has_development_files(self, package_info: PackageInfo) -> bool:
+        """Check if the package has development files.
+        
+        Args:
+            package_info: Package information
+            
+        Returns:
+            True if development files are present
+        """
+        if not package_info.source_dir:
+            return False
+        
+        source_path = Path(package_info.source_dir)
+        
+        # Look for header files
+        if any(source_path.rglob("*.h")):
+            return True
+        
+        # Look for pkg-config files
+        if any(source_path.rglob("*.pc")):
+            return True
+        
+        return False
+
+    def _has_man_pages(self, package_info: PackageInfo) -> bool:
+        """Check if the package has man pages.
+        
+        Args:
+            package_info: Package information
+            
+        Returns:
+            True if man pages are present
+        """
+        if not package_info.source_dir:
+            return False
+        
+        source_path = Path(package_info.source_dir)
+        
+        # Look for man directories
+        for path in source_path.rglob("man*"):
+            if path.is_dir():
+                return True
+        
+        return False
+
+    def _is_gui_application(self, package_info: PackageInfo) -> bool:
+        """Check if this is a GUI application.
+        
+        Args:
+            package_info: Package information
+            
+        Returns:
+            True if this is a GUI application
+        """
+        if not package_info.source_dir:
+            return False
+        
+        source_path = Path(package_info.source_dir)
+        
+        # Look for desktop files
+        if any(source_path.rglob("*.desktop")):
+            return True
+        
+        # Look for GUI-related dependencies
+        for dep in package_info.runtime_dependencies:
+            dep_lower = dep.lower()
+            if any(gui_lib in dep_lower for gui_lib in ["gtk", "qt", "wx", "fltk", "sdl"]):
+                return True
+        
+        return False
+
+    def _find_description_file(self, package_info: PackageInfo) -> Optional[str]:
+        """Find a description file for the package.
+        
+        Args:
+            package_info: Package information
+            
+        Returns:
+            Path to description file or None
+        """
+        if not package_info.source_dir:
+            return None
+        
+        source_path = Path(package_info.source_dir)
+        
+        # Common description file names
+        description_files = [
+            "README.md", "README.rst", "README.txt", "README",
+            "DESCRIPTION.md", "DESCRIPTION.rst", "DESCRIPTION.txt", "DESCRIPTION",
+            "NEWS.md", "NEWS.rst", "NEWS.txt", "NEWS",
+            "CHANGELOG.md", "CHANGELOG.rst", "CHANGELOG.txt", "CHANGELOG"
+        ]
+        
+        for filename in description_files:
+            file_path = source_path / filename
+            if file_path.exists():
+                return filename
+        
+        return None
+
+    def update_with_autoprovides(self, manifest: USMManifest, autoprovides: Dict[str, Any]) -> USMManifest:
+        """Update a manifest with autoprovides from USM.
+        
+        Args:
+            manifest: Existing USM manifest
+            autoprovides: Autoprovides dictionary from USM
+            
+        Returns:
+            Updated USM manifest
+        """
+        # Merge autoprovides into the provides section
+        for resource_ref, resource in autoprovides.items():
+            if resource_ref not in manifest.provides:
+                manifest.provides[resource_ref] = resource
+        
+        return manifest
+
+    def validate_manifest(self, manifest: USMManifest) -> List[str]:
+        """Validate a USM manifest.
+        
+        Args:
+            manifest: USM manifest to validate
+            
+        Returns:
+            List of validation errors
+        """
+        errors = []
+        
+        # Check required fields
+        if not manifest.name:
+            errors.append("Package name is required")
+        elif " " in manifest.name:
+            errors.append("Package name cannot contain spaces")
+        
+        if not manifest.version:
+            errors.append("Package version is required")
+        
+        if not manifest.summary:
+            errors.append("Package summary is required")
+        
+        if not manifest.licences:
+            errors.append("At least one license is required")
+        
+        # Check license format
+        for license in manifest.licences:
+            if not license.name:
+                errors.append("License name is required")
+            if not license.text:
+                errors.append("License text path is required")
+            if not license.category:
+                errors.append("License category is required")
+        
+        # Check provides section
+        if not manifest.provides:
+            errors.append("At least one provided resource is required")
+        
+        # Check dependencies
+        if not manifest.depends.runtime:
+            errors.append("At least one runtime dependency is required")
+        if not manifest.depends.build:
+            errors.append("At least one build dependency is required")
+        if not manifest.depends.manage:
+            errors.append("At least one manage dependency is required")
+        
+        # Check executables
+        if not manifest.execs.build:
+            errors.append("Build script path is required")
+        
+        return errors
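The dependency-to-resource mapping used by `_convert_to_resource_refs` can be restated as a small standalone function. This is a sketch of the same heuristic (checking the longer `-devel` suffix before `-dev` so it is stripped whole), not the full implementation:

```python
def dep_to_resource_ref(dep: str) -> str:
    """Map a dependency name to a USM resource reference (heuristic).

    Libraries map to lib:, development packages to inc:, and
    everything else (compilers, runtimes, unknowns) falls back to bin:.
    """
    d = dep.lower()
    if d.endswith("-devel"):  # check the longer suffix first
        return f"inc:{d[:-len('-devel')]}"
    if d.endswith("-dev"):
        return f"inc:{d[:-len('-dev')]}"
    if d.startswith("lib"):
        return f"lib:{d}"
    return f"bin:{d}"
```

Suffix order matters here: testing `-dev` first would leave a stray `el` on `-devel` packages.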

+ 758 - 0
src/autusm/metadata.py

@@ -0,0 +1,758 @@
+"""
+Metadata extractor for autusm.
+
+This module provides functionality to extract metadata from various
+package configuration files like package.json, setup.py, Cargo.toml, etc.
+"""
+
+import os
+import json
+import logging
+import re
+import ast
+import toml
+from pathlib import Path
+from typing import Dict, List, Optional, Any, Tuple
+
+from .models import PackageInfo, License, LicenseCategory
+from .exceptions import MetadataError
+
+
+logger = logging.getLogger(__name__)
+
+
+class MetadataExtractor:
+    """Extractor for package metadata from various configuration files."""
+
+    def __init__(self):
+        """Initialize the metadata extractor."""
+        # Define file patterns for different package types
+        self.package_patterns = {
+            "python": [
+                "setup.py",
+                "pyproject.toml",
+                "setup.cfg",
+                "requirements.txt"
+            ],
+            "rust": [
+                "Cargo.toml",
+                "Cargo.lock"
+            ],
+            "node": [
+                "package.json",
+                "package-lock.json",
+                "npm-shrinkwrap.json"
+            ],
+            "ruby": [
+                "Gemfile",
+                "*.gemspec"
+            ],
+            "perl": [
+                "Makefile.PL",
+                "META.json",
+                "META.yml"
+            ],
+            "php": [
+                "composer.json"
+            ],
+            "java": [
+                "pom.xml",
+                "build.gradle"
+            ]
+        }
+
+        # License mapping for common license identifiers
+        self.license_mapping = {
+            "MIT": LicenseCategory.LIBRE,
+            "Apache-2.0": LicenseCategory.LIBRE,
+            "GPL-2.0": LicenseCategory.LIBRE,
+            "GPL-3.0": LicenseCategory.LIBRE,
+            "LGPL-2.1": LicenseCategory.LIBRE,
+            "LGPL-3.0": LicenseCategory.LIBRE,
+            "BSD-2-Clause": LicenseCategory.LIBRE,
+            "BSD-3-Clause": LicenseCategory.LIBRE,
+            "ISC": LicenseCategory.LIBRE,
+            "MPL-2.0": LicenseCategory.LIBRE,
+            "AGPL-3.0": LicenseCategory.LIBRE,
+            "BSL-1.0": LicenseCategory.LIBRE,
+            "Unlicense": LicenseCategory.LIBRE,
+            "CC0-1.0": LicenseCategory.LIBRE,
+            "EPL-1.0": LicenseCategory.LIBRE,
+            "EPL-2.0": LicenseCategory.LIBRE,
+            "Proprietary": LicenseCategory.PROPRIETARY,
+            "Commercial": LicenseCategory.PROPRIETARY
+        }
+
+    def extract(self, source_dir: Path) -> PackageInfo:
+        """Extract metadata from a source directory.
+        
+        Args:
+            source_dir: Path to the source directory
+            
+        Returns:
+            PackageInfo object with extracted metadata
+            
+        Raises:
+            MetadataError: If extraction fails
+        """
+        try:
+            logger.info(f"Extracting metadata from {source_dir}")
+            
+            package_info = PackageInfo()
+            
+            # Find and process package files
+            package_files = self._find_package_files(source_dir)
+            
+            for file_path in package_files:
+                relative_path = file_path.relative_to(source_dir)
+                package_info.metadata_files.append(str(relative_path))
+                
+                # Extract metadata based on file type
+                if file_path.name == "package.json":
+                    self._extract_from_package_json(file_path, package_info)
+                elif file_path.name == "setup.py":
+                    self._extract_from_setup_py(file_path, package_info)
+                elif file_path.name == "pyproject.toml":
+                    self._extract_from_pyproject_toml(file_path, package_info)
+                elif file_path.name == "Cargo.toml":
+                    self._extract_from_cargo_toml(file_path, package_info)
+                elif file_path.name == "composer.json":
+                    self._extract_from_composer_json(file_path, package_info)
+                elif file_path.name == "pom.xml":
+                    self._extract_from_pom_xml(file_path, package_info)
+                elif file_path.name == "build.gradle":
+                    self._extract_from_build_gradle(file_path, package_info)
+            
+            # Extract additional metadata from common locations
+            self._extract_from_readme(source_dir, package_info)
+            self._extract_from_license_files(source_dir, package_info)
+            self._extract_from_git_info(source_dir, package_info)
+            
+            # If we still don't have a name, try to derive from directory
+            if not package_info.name:
+                package_info.name = source_dir.name.lower().replace('-', '_').replace(' ', '_')
+            
+            logger.info(f"Extracted metadata for package: {package_info.name}")
+            return package_info
+            
+        except Exception as e:
+            logger.error(f"Failed to extract metadata: {e}")
+            raise MetadataError(f"Failed to extract metadata: {e}")
+
+    def _find_package_files(self, source_dir: Path) -> List[Path]:
+        """Find package configuration files in the source directory.
+        
+        Args:
+            source_dir: Path to the source directory
+            
+        Returns:
+            List of package configuration file paths
+        """
+        package_files = []
+        
+        for root, dirs, files in os.walk(source_dir):
+            # Skip hidden directories and common build directories
+            dirs[:] = [d for d in dirs if not d.startswith('.') and d not in ['build', 'target', 'node_modules', '__pycache__']]
+            
+            for file in files:
+                file_path = Path(root) / file
+                
+                # Check if this is a package file; append at most once even
+                # if the name matches patterns from several package types
+                if any(
+                    self._match_pattern(file, pattern)
+                    for patterns in self.package_patterns.values()
+                    for pattern in patterns
+                ):
+                    package_files.append(file_path)
+        
+        return package_files
+
+    def _match_pattern(self, filename: str, pattern: str) -> bool:
+        """Check if a filename matches a pattern.
+        
+        Args:
+            filename: Name of the file
+            pattern: Pattern to match (can include wildcards)
+            
+        Returns:
+            True if the filename matches the pattern
+        """
+        if pattern.startswith("*."):
+            return filename.endswith(pattern[1:])
+        else:
+            return filename == pattern
+
+    def _extract_from_package_json(self, file_path: Path, package_info: PackageInfo) -> None:
+        """Extract metadata from package.json file.
+        
+        Args:
+            file_path: Path to the package.json file
+            package_info: PackageInfo object to update
+        """
+        try:
+            with open(file_path, 'r', encoding='utf-8') as f:
+                data = json.load(f)
+            
+            # Extract basic information
+            if not package_info.name and "name" in data:
+                package_info.name = data["name"]
+            
+            if not package_info.version and "version" in data:
+                package_info.version = data["version"]
+            
+            if not package_info.summary and "description" in data:
+                package_info.summary = data["description"]
+            
+            if not package_info.url and "homepage" in data:
+                package_info.url = data["homepage"]
+            elif not package_info.url and "repository" in data:
+                if isinstance(data["repository"], dict) and "url" in data["repository"]:
+                    package_info.url = data["repository"]["url"]
+                elif isinstance(data["repository"], str):
+                    package_info.url = data["repository"]
+            
+            # Extract author information
+            if "author" in data:
+                author = data["author"]
+                if isinstance(author, dict):
+                    author_name = author.get("name", "")
+                    if author_name:
+                        package_info.authors.append(author_name)
+                elif isinstance(author, str):
+                    package_info.authors.append(author)
+            
+            # Extract license information
+            if "license" in data:
+                license_name = data["license"]
+                category = self._map_license_category(license_name)
+                package_info.licenses.append(
+                    License(name=license_name, text="LICENSE", category=category)
+                )
+            
+            # Extract dependencies
+            if "dependencies" in data:
+                package_info.runtime_dependencies.extend(data["dependencies"].keys())
+            
+            if "devDependencies" in data:
+                package_info.build_dependencies.extend(data["devDependencies"].keys())
+            
+        except Exception as e:
+            logger.warning(f"Failed to extract metadata from {file_path}: {e}")
+
+    def _extract_from_setup_py(self, file_path: Path, package_info: PackageInfo) -> None:
+        """Extract metadata from setup.py file.
+        
+        Args:
+            file_path: Path to the setup.py file
+            package_info: PackageInfo object to update
+        """
+        try:
+            with open(file_path, 'r', encoding='utf-8') as f:
+                content = f.read()
+            
+            # Parse the setup.py file to extract setup() arguments
+            tree = ast.parse(content)
+            
+            for node in ast.walk(tree):
+                # Match both bare setup(...) and setuptools.setup(...)
+                is_setup_call = isinstance(node, ast.Call) and (
+                    (isinstance(node.func, ast.Name) and node.func.id == 'setup')
+                    or (isinstance(node.func, ast.Attribute) and node.func.attr == 'setup')
+                )
+                if is_setup_call:
+                    # Extract keyword arguments
+                    for keyword in node.keywords:
+                        arg_name = keyword.arg
+                        arg_value = self._extract_ast_value(keyword.value)
+                        
+                        if arg_name == "name" and not package_info.name:
+                            package_info.name = arg_value
+                        elif arg_name == "version" and not package_info.version:
+                            package_info.version = arg_value
+                        elif arg_name == "description" and not package_info.summary:
+                            package_info.summary = arg_value
+                        elif arg_name == "url" and not package_info.url:
+                            package_info.url = arg_value
+                        elif arg_name == "author" and arg_value:
+                            package_info.authors.append(arg_value)
+                        elif arg_name == "license" and arg_value:
+                            category = self._map_license_category(arg_value)
+                            package_info.licenses.append(
+                                License(name=arg_value, text="LICENSE", category=category)
+                            )
+                        elif arg_name == "install_requires" and arg_value:
+                            if isinstance(arg_value, list):
+                                package_info.runtime_dependencies.extend(arg_value)
+                        elif arg_name == "setup_requires" and arg_value:
+                            if isinstance(arg_value, list):
+                                package_info.build_dependencies.extend(arg_value)
+                    
+                    break
+                    
+        except Exception as e:
+            logger.warning(f"Failed to extract metadata from {file_path}: {e}")
+
+    def _extract_from_pyproject_toml(self, file_path: Path, package_info: PackageInfo) -> None:
+        """Extract metadata from pyproject.toml file.
+        
+        Args:
+            file_path: Path to the pyproject.toml file
+            package_info: PackageInfo object to update
+        """
+        try:
+            with open(file_path, 'r', encoding='utf-8') as f:
+                data = toml.load(f)
+            
+            # Extract from [project] section (PEP 621)
+            if "project" in data:
+                project = data["project"]
+                
+                if not package_info.name and "name" in project:
+                    package_info.name = project["name"]
+                
+                if not package_info.version and "version" in project:
+                    package_info.version = project["version"]
+                
+                if not package_info.summary and "description" in project:
+                    package_info.summary = project["description"]
+                
+                if "authors" in project:
+                    for author in project["authors"]:
+                        if isinstance(author, dict) and "name" in author:
+                            package_info.authors.append(author["name"])
+                        elif isinstance(author, str):
+                            package_info.authors.append(author)
+                
+                if "license" in project:
+                    if isinstance(project["license"], dict) and "text" in project["license"]:
+                        license_name = project["license"]["text"]
+                    else:
+                        license_name = str(project["license"])
+                    
+                    category = self._map_license_category(license_name)
+                    package_info.licenses.append(
+                        License(name=license_name, text="LICENSE", category=category)
+                    )
+                
+                if "dependencies" in project:
+                    package_info.runtime_dependencies.extend(project["dependencies"])
+            
+            # Extract from [tool.poetry] section
+            if "tool" in data and "poetry" in data["tool"]:
+                poetry = data["tool"]["poetry"]
+                
+                if not package_info.name and "name" in poetry:
+                    package_info.name = poetry["name"]
+                
+                if not package_info.version and "version" in poetry:
+                    package_info.version = poetry["version"]
+                
+                if not package_info.summary and "description" in poetry:
+                    package_info.summary = poetry["description"]
+                
+                if "authors" in poetry:
+                    package_info.authors.extend(poetry["authors"])
+                
+                if "license" in poetry:
+                    license_name = poetry["license"]
+                    category = self._map_license_category(license_name)
+                    package_info.licenses.append(
+                        License(name=license_name, text="LICENSE", category=category)
+                    )
+                
+                if "dependencies" in poetry:
+                    package_info.runtime_dependencies.extend(poetry["dependencies"].keys())
+                
+                if "dev-dependencies" in poetry:
+                    package_info.build_dependencies.extend(poetry["dev-dependencies"].keys())
+                    
+        except Exception as e:
+            logger.warning(f"Failed to extract metadata from {file_path}: {e}")
+
+    def _extract_from_cargo_toml(self, file_path: Path, package_info: PackageInfo) -> None:
+        """Extract metadata from Cargo.toml file.
+        
+        Args:
+            file_path: Path to the Cargo.toml file
+            package_info: PackageInfo object to update
+        """
+        try:
+            with open(file_path, 'r', encoding='utf-8') as f:
+                data = toml.load(f)
+            
+            # Extract from [package] section
+            if "package" in data:
+                package = data["package"]
+                
+                if not package_info.name and "name" in package:
+                    package_info.name = package["name"]
+                
+                if not package_info.version and "version" in package:
+                    package_info.version = package["version"]
+                
+                if not package_info.summary and "description" in package:
+                    package_info.summary = package["description"]
+                
+                if "authors" in package:
+                    package_info.authors.extend(package["authors"])
+                
+                if "license" in package:
+                    license_name = package["license"]
+                    category = self._map_license_category(license_name)
+                    package_info.licenses.append(
+                        License(name=license_name, text="LICENSE", category=category)
+                    )
+                
+                if "repository" in package:
+                    package_info.url = package["repository"]
+            
+            # Extract dependencies
+            if "dependencies" in data:
+                package_info.runtime_dependencies.extend(data["dependencies"].keys())
+            
+            if "dev-dependencies" in data:
+                package_info.build_dependencies.extend(data["dev-dependencies"].keys())
+                
+        except Exception as e:
+            logger.warning(f"Failed to extract metadata from {file_path}: {e}")
+
+    def _extract_from_composer_json(self, file_path: Path, package_info: PackageInfo) -> None:
+        """Extract metadata from composer.json file.
+        
+        Args:
+            file_path: Path to the composer.json file
+            package_info: PackageInfo object to update
+        """
+        try:
+            with open(file_path, 'r', encoding='utf-8') as f:
+                data = json.load(f)
+            
+            # Extract basic information
+            if not package_info.name and "name" in data:
+                package_info.name = data["name"]
+            
+            if not package_info.version and "version" in data:
+                package_info.version = data["version"]
+            
+            if not package_info.summary and "description" in data:
+                package_info.summary = data["description"]
+            
+            if not package_info.url and "homepage" in data:
+                package_info.url = data["homepage"]
+            
+            # Extract author information
+            if "authors" in data:
+                for author in data["authors"]:
+                    if isinstance(author, dict) and "name" in author:
+                        package_info.authors.append(author["name"])
+            
+            # Extract license information
+            if "license" in data:
+                license_data = data["license"]
+                if isinstance(license_data, list):
+                    for license_name in license_data:
+                        category = self._map_license_category(license_name)
+                        package_info.licenses.append(
+                            License(name=license_name, text="LICENSE", category=category)
+                        )
+                else:
+                    category = self._map_license_category(license_data)
+                    package_info.licenses.append(
+                        License(name=license_data, text="LICENSE", category=category)
+                    )
+            
+            # Extract dependencies
+            if "require" in data:
+                package_info.runtime_dependencies.extend(data["require"].keys())
+            
+            if "require-dev" in data:
+                package_info.build_dependencies.extend(data["require-dev"].keys())
+                
+        except Exception as e:
+            logger.warning(f"Failed to extract metadata from {file_path}: {e}")
+
+    def _extract_from_pom_xml(self, file_path: Path, package_info: PackageInfo) -> None:
+        """Extract metadata from pom.xml file.
+        
+        Args:
+            file_path: Path to the pom.xml file
+            package_info: PackageInfo object to update
+        """
+        # This is a simplified implementation
+        # In a real implementation, you would use an XML parser
+        try:
+            with open(file_path, 'r', encoding='utf-8') as f:
+                content = f.read()
+            
+            # Use regex to extract basic information
+            # This is not robust but serves as a starting point
+            if not package_info.name:
+                name_match = re.search(r'<artifactId>(.*?)</artifactId>', content)
+                if name_match:
+                    package_info.name = name_match.group(1)
+            
+            if not package_info.version:
+                version_match = re.search(r'<version>(.*?)</version>', content)
+                if version_match:
+                    package_info.version = version_match.group(1)
+            
+            if not package_info.summary:
+                desc_match = re.search(r'<description>(.*?)</description>', content)
+                if desc_match:
+                    package_info.summary = desc_match.group(1)
+            
+            if not package_info.url:
+                url_match = re.search(r'<url>(.*?)</url>', content)
+                if url_match:
+                    package_info.url = url_match.group(1)
+                    
+        except Exception as e:
+            logger.warning(f"Failed to extract metadata from {file_path}: {e}")
+
+    def _extract_from_build_gradle(self, file_path: Path, package_info: PackageInfo) -> None:
+        """Extract metadata from build.gradle file.
+        
+        Args:
+            file_path: Path to the build.gradle file
+            package_info: PackageInfo object to update
+        """
+        # This is a simplified implementation
+        try:
+            with open(file_path, 'r', encoding='utf-8') as f:
+                content = f.read()
+            
+            # Use regex to extract basic information
+            if not package_info.name:
+                name_match = re.search(r'archivesBaseName\s*=\s*["\']([^"\']+)["\']', content)
+                if name_match:
+                    package_info.name = name_match.group(1)
+            
+            if not package_info.version:
+                version_match = re.search(r'version\s*=\s*["\']([^"\']+)["\']', content)
+                if version_match:
+                    package_info.version = version_match.group(1)
+            
+            if not package_info.summary:
+                desc_match = re.search(r'description\s*=\s*["\']([^"\']+)["\']', content)
+                if desc_match:
+                    package_info.summary = desc_match.group(1)
+                    
+        except Exception as e:
+            logger.warning(f"Failed to extract metadata from {file_path}: {e}")
+
+    def _extract_from_readme(self, source_dir: Path, package_info: PackageInfo) -> None:
+        """Extract information from README files.
+        
+        Args:
+            source_dir: Path to the source directory
+            package_info: PackageInfo object to update
+        """
+        readme_files = [
+            "README", "README.md", "README.txt", "README.rst",
+            "readme", "readme.md", "readme.txt", "readme.rst"
+        ]
+        
+        for readme_name in readme_files:
+            readme_path = source_dir / readme_name
+            if readme_path.exists():
+                try:
+                    with open(readme_path, 'r', encoding='utf-8') as f:
+                        content = f.read()
+                    
+                    # Extract the first paragraph as description if not already set
+                    if not package_info.description:
+                        lines = content.split('\n')
+                        description_lines = []
+                        for line in lines:
+                            line = line.strip()
+                            if line and not line.startswith('#'):
+                                description_lines.append(line)
+                            elif description_lines:
+                                break
+                        
+                        if description_lines:
+                            package_info.description = ' '.join(description_lines)
+                    
+                    # Extract project URL if not already set
+                    if not package_info.url:
+                        # Look for common URL patterns
+                        url_patterns = [
+                            r'github\.com/([^\s/]+/[^\s/]+)',
+                            r'gitlab\.com/([^\s/]+/[^\s/]+)',
+                            r'https?://([^\s]+)'
+                        ]
+                        
+                        for pattern in url_patterns:
+                            match = re.search(pattern, content)
+                            if match:
+                                if 'github' in pattern or 'gitlab' in pattern:
+                                    package_info.url = f"https://{match.group(0)}"
+                                else:
+                                    package_info.url = match.group(0)
+                                break
+                    
+                    break  # Use the first README file found
+                    
+                except Exception as e:
+                    logger.warning(f"Failed to extract information from {readme_path}: {e}")
+
+    def _extract_from_license_files(self, source_dir: Path, package_info: PackageInfo) -> None:
+        """Extract information from license files.
+        
+        Args:
+            source_dir: Path to the source directory
+            package_info: PackageInfo object to update
+        """
+        license_files = [
+            "LICENSE", "LICENSE.txt", "LICENSE.md",
+            "COPYING", "COPYING.txt",
+            "license", "license.txt", "license.md",
+            "copying", "copying.txt"
+        ]
+        
+        for license_name in license_files:
+            license_path = source_dir / license_name
+            if license_path.exists():
+                try:
+                    with open(license_path, 'r', encoding='utf-8') as f:
+                        content = f.read()
+                    
+                    # Try to identify the license type
+                    license_type = self._identify_license_type(content)
+                    if license_type:
+                        category = self._map_license_category(license_type)
+                        
+                        # Check if we already have this license
+                        for license in package_info.licenses:
+                            if license.name == license_type:
+                                break
+                        else:
+                            package_info.licenses.append(
+                                License(name=license_type, text=license_name, category=category)
+                            )
+                    
+                    break  # Use the first license file found
+                    
+                except Exception as e:
+                    logger.warning(f"Failed to extract information from {license_path}: {e}")
+
+    def _extract_from_git_info(self, source_dir: Path, package_info: PackageInfo) -> None:
+        """Extract information from git repository.
+        
+        Args:
+            source_dir: Path to the source directory
+            package_info: PackageInfo object to update
+        """
+        git_dir = source_dir / '.git'
+        if not git_dir.exists():
+            return
+        
+        try:
+            # Try to get remote URL
+            git_config = git_dir / 'config'
+            if git_config.exists():
+                with open(git_config, 'r', encoding='utf-8') as f:
+                    content = f.read()
+                
+                # Look for remote URL
+                url_match = re.search(r'url\s*=\s*(.+)', content)
+                if url_match and not package_info.url:
+                    url = url_match.group(1).strip()
+                    # Convert SSH form (git@host:owner/repo) to https://host/owner/repo;
+                    # replace only the first ':' so any later colons survive
+                    if url.startswith('git@'):
+                        url = 'https://' + url[len('git@'):].replace(':', '/', 1)
+                    package_info.url = url
+                    
+        except Exception as e:
+            logger.warning(f"Failed to extract git information: {e}")
+
+    def _extract_ast_value(self, node) -> Optional[Union[str, List[str]]]:
+        """Extract a string or list-of-strings value from an AST node.
+        
+        Args:
+            node: AST node
+            
+        Returns:
+            String value, list of string values, or None
+        """
+        if isinstance(node, ast.Constant) and isinstance(node.value, str):
+            return node.value
+        # Fallback for Python < 3.8, where string literals parse as ast.Str;
+        # the hasattr guard also covers 3.12+, where ast.Str was removed
+        elif hasattr(ast, 'Str') and isinstance(node, ast.Str):
+            return node.s
+        elif isinstance(node, ast.List):
+            items = []
+            for item in node.elts:
+                value = self._extract_ast_value(item)
+                if isinstance(value, str):
+                    items.append(value)
+            return items
+        return None
+
+    def _identify_license_type(self, content: str) -> Optional[str]:
+        """Identify the license type from license file content.
+        
+        Args:
+            content: Content of the license file
+            
+        Returns:
+            License type string or None
+        """
+        # Simple keyword-based license detection
+        content_lower = content.lower()
+        
+        if "mit license" in content_lower or "permission is hereby granted" in content_lower:
+            return "MIT"
+        elif "apache license" in content_lower or "apache-2.0" in content_lower:
+            return "Apache-2.0"
+        elif "gnu lesser general public license" in content_lower or "lgpl" in content_lower:
+            if "version 3" in content_lower:
+                return "LGPL-3.0"
+            elif "version 2" in content_lower:
+                return "LGPL-2.1"
+            else:
+                return "LGPL"
+        # Check GPL only after LGPL: "lgpl" contains "gpl" as a substring,
+        # so the broader GPL test would otherwise shadow the LGPL branch
+        elif "gnu general public license" in content_lower or "gpl" in content_lower:
+            if "version 3" in content_lower:
+                return "GPL-3.0"
+            elif "version 2" in content_lower:
+                return "GPL-2.0"
+            else:
+                return "GPL"
+        elif "bsd license" in content_lower:
+            if "3-clause" in content_lower:
+                return "BSD-3-Clause"
+            elif "2-clause" in content_lower:
+                return "BSD-2-Clause"
+            else:
+                return "BSD"
+        elif "mozilla public license" in content_lower or "mpl-2.0" in content_lower:
+            return "MPL-2.0"
+        elif "boost software license" in content_lower:
+            return "BSL-1.0"
+        elif "unlicense" in content_lower:
+            return "Unlicense"
+        elif "creative commons" in content_lower or "cc0" in content_lower:
+            return "CC0-1.0"
+        
+        return None
+
+    def _map_license_category(self, license_name: str) -> LicenseCategory:
+        """Map a license name to a license category.
+        
+        Args:
+            license_name: Name of the license
+            
+        Returns:
+            LicenseCategory enum value
+        """
+        # Direct mapping
+        if license_name in self.license_mapping:
+            return self.license_mapping[license_name]
+        
+        # Check if the license name contains known keywords
+        license_lower = license_name.lower()
+        
+        if "mit" in license_lower or "apache" in license_lower or "bsd" in license_lower:
+            return LicenseCategory.LIBRE
+        elif "gpl" in license_lower or "lgpl" in license_lower or "agpl" in license_lower:
+            return LicenseCategory.LIBRE
+        elif "proprietary" in license_lower or "commercial" in license_lower:
+            return LicenseCategory.PROPRIETARY
+        else:
+            return LicenseCategory.SOURCE_AVAILABLE
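The `setup()` keyword extraction above can be exercised in isolation. A minimal sketch, using the same AST-walk approach (the standalone function name and sample are illustrative, not part of autusm's API); note that the source is only parsed, never executed, so untrusted `setup.py` files are safe to inspect this way:

```python
import ast

def extract_setup_kwargs(source: str) -> dict:
    """Walk the AST and collect string-valued keyword args of setup()."""
    tree = ast.parse(source)
    kwargs = {}
    for node in ast.walk(tree):
        if (isinstance(node, ast.Call)
                and isinstance(node.func, ast.Name)
                and node.func.id == "setup"):
            for kw in node.keywords:
                # Keep only plain string literals; skip computed values
                if (kw.arg
                        and isinstance(kw.value, ast.Constant)
                        and isinstance(kw.value.value, str)):
                    kwargs[kw.arg] = kw.value.value
            break  # Only the first setup() call matters
    return kwargs

sample = 'from setuptools import setup\nsetup(name="demo", version="1.2.3")'
print(extract_setup_kwargs(sample))  # → {'name': 'demo', 'version': '1.2.3'}
```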

+ 295 - 0
src/autusm/models.py

@@ -0,0 +1,295 @@
+"""
+Data models for autusm package.
+
+This module defines the core data structures used throughout the autusm
+package, including PackageInfo, BuildSystem, and USMManifest.
+"""
+
+import json
+from enum import Enum
+from typing import Dict, List, Optional, Union, Any
+from dataclasses import dataclass, field
+
+
+class BuildSystemType(Enum):
+    """Enumeration of supported build systems."""
+    AUTOTOOLS = "autotools"
+    CMAKE = "cmake"
+    MESON = "meson"
+    MAKE = "make"
+    PYTHON = "python"
+    CARGO = "cargo"
+    NPM = "npm"
+    UNKNOWN = "unknown"
+
+
+class LicenseCategory(Enum):
+    """Enumeration of license categories as defined in USM spec."""
+    LIBRE = "libre"
+    OPEN_SOURCE = "open-source"
+    SOURCE_AVAILABLE = "source-available"
+    PROPRIETARY = "proprietary"
+
+
+class ResourceType(Enum):
+    """Enumeration of resource types as defined in USM spec."""
+    ROOTPATH = "rootpath"
+    PATH = "path"
+    OPT = "opt"
+    RES = "res"
+    CFG = "cfg"
+    BIN = "bin"
+    SBIN = "sbin"
+    LIB = "lib"
+    LIBEXEC = "libexec"
+    LIBRES = "libres"
+    INFO = "info"
+    MAN = "man"
+    LOCALE = "locale"
+    APP = "app"
+    INC = "inc"
+    PC = "pc"
+    VAPI = "vapi"
+    GIR = "gir"
+    TYPELIB = "typelib"
+    TAG = "tag"
+
+
+class PathBase(Enum):
+    """Enumeration of path base options as defined in USM spec."""
+    SOURCE = "source"
+    BUILD = "build"
+    INSTALL = "install"
+    AS_EXPECTED = "as-expected"
+
+
+class FileType(Enum):
+    """Enumeration of file types for resources."""
+    REG = "reg"
+    DIR = "dir"
+    LNK = "lnk"
+
+
+@dataclass
+class License:
+    """License information for a package."""
+    name: str
+    text: str  # Path to license file relative to MANIFEST.usm
+    category: LicenseCategory
+
+
+@dataclass
+class Resource:
+    """Resource provided by a package."""
+    path: Optional[str] = None
+    path_base: Optional[PathBase] = None
+    type: FileType = FileType.REG
+    dest: Optional[str] = None  # For symlinks
+    keep_on: List[str] = field(default_factory=list)
+    skip_for: List[str] = field(default_factory=list)
+
+
+@dataclass
+class Dependencies:
+    """Package dependencies."""
+    runtime: List[str] = field(default_factory=list)
+    build: List[str] = field(default_factory=list)
+    manage: List[str] = field(default_factory=list)
+    acquire: List[str] = field(default_factory=list)
+
+
+@dataclass
+class Executables:
+    """Executable scripts for different phases."""
+    build: str
+    install: Optional[str] = None
+    remove: Optional[str] = None
+    post_install: Optional[str] = None
+    acquire: Optional[str] = None
+
+
+@dataclass
+class GitInfo:
+    """Git repository information."""
+    origin: str
+    commit: str
+
+
+@dataclass
+class BuildSystem:
+    """Build system information."""
+    type: BuildSystemType
+    config_files: List[str] = field(default_factory=list)
+    build_files: List[str] = field(default_factory=list)
+    detected_commands: List[str] = field(default_factory=list)
+    custom_args: Dict[str, str] = field(default_factory=dict)
+
+
+@dataclass
+class PackageInfo:
+    """Information extracted from a source package."""
+    name: Optional[str] = None
+    version: Optional[str] = None
+    summary: Optional[str] = None
+    description: Optional[str] = None
+    url: Optional[str] = None
+    licenses: List[License] = field(default_factory=list)
+    authors: List[str] = field(default_factory=list)
+    maintainers: List[str] = field(default_factory=list)
+    build_system: Optional[BuildSystem] = None
+    dependencies: Dependencies = field(default_factory=Dependencies)
+    provides: List[str] = field(default_factory=list)
+    build_dependencies: List[str] = field(default_factory=list)
+    runtime_dependencies: List[str] = field(default_factory=list)
+    source_dir: Optional[str] = None
+    metadata_files: List[str] = field(default_factory=list)
+    extra_data: Dict[str, Any] = field(default_factory=dict)
+
+
+@dataclass
+class USMManifest:
+    """USM Manifest structure according to USM specification."""
+    name: str
+    version: str
+    summary: str
+    licences: List[License]
+    provides: Dict[str, Union[Resource, str]]  # ResourceRef -> Resource or shorthand
+    depends: Dependencies
+    flags: List[str]
+    execs: Executables
+    md: Optional[str] = None
+    url: Optional[str] = None
+    screenshots: List[str] = field(default_factory=list)
+    icon: Optional[str] = None
+    metainfo: Optional[str] = None
+    git: Optional[GitInfo] = None
+    extras: Dict[str, Any] = field(default_factory=dict)
+
+    def to_dict(self) -> Dict[str, Any]:
+        """Convert the manifest to a dictionary for JSON serialization."""
+        result = {
+            "name": self.name,
+            "version": self.version,
+            "summary": self.summary,
+            "licences": [
+                {
+                    "name": lic.name,
+                    "text": lic.text,
+                    "category": lic.category.value
+                }
+                for lic in self.licences
+            ],
+            "provides": {},
+            "depends": {
+                "runtime": self.depends.runtime,
+                "build": self.depends.build,
+                "manage": self.depends.manage,
+            },
+            "flags": self.flags,
+            "execs": {
+                "build": self.execs.build,
+            }
+        }
+
+        # Add optional acquire dependency if present
+        if self.depends.acquire:
+            result["depends"]["acquire"] = self.depends.acquire
+
+        # Add optional execs
+        if self.execs.install:
+            result["execs"]["install"] = self.execs.install
+        if self.execs.remove:
+            result["execs"]["remove"] = self.execs.remove
+        if self.execs.post_install:
+            result["execs"]["postInstall"] = self.execs.post_install
+        if self.execs.acquire:
+            result["execs"]["acquire"] = self.execs.acquire
+
+        # Process provides - convert Resource objects to dictionaries or use shorthand
+        for key, value in self.provides.items():
+            if isinstance(value, Resource):
+                resource_dict = {"type": value.type.value}
+                if value.path is not None:
+                    resource_dict["path"] = value.path
+                if value.path_base is not None:
+                    resource_dict["pathBase"] = value.path_base.value
+                if value.dest is not None:
+                    resource_dict["dest"] = value.dest
+                if value.keep_on:
+                    resource_dict["keepOn"] = value.keep_on
+                if value.skip_for:
+                    resource_dict["skipFor"] = value.skip_for
+                result["provides"][key] = resource_dict
+            else:
+                # Use shorthand string directly
+                result["provides"][key] = value
+
+        # Add optional fields
+        if self.md:
+            result["md"] = self.md
+        if self.url:
+            result["url"] = self.url
+        if self.screenshots:
+            result["screenshots"] = self.screenshots
+        if self.icon:
+            result["icon"] = self.icon
+        if self.metainfo:
+            result["metainfo"] = self.metainfo
+        if self.git:
+            result["git"] = {
+                "origin": self.git.origin,
+                "commit": self.git.commit
+            }
+        if self.extras:
+            result["extras"] = self.extras
+
+        return result
+
+    def to_json(self, indent: int = 2) -> str:
+        """Convert the manifest to a JSON string."""
+        return json.dumps(self.to_dict(), indent=indent)
+
+    @classmethod
+    def from_package_info(cls, package_info: PackageInfo) -> "USMManifest":
+        """Create a USMManifest from PackageInfo."""
+        # This is a simplified conversion - in practice, more logic would be needed
+        # to properly map PackageInfo to USMManifest
+        
+        if not package_info.name:
+            raise ValueError("Package name is required")
+        if not package_info.version:
+            raise ValueError("Package version is required")
+        if not package_info.summary:
+            raise ValueError("Package summary is required")
+
+        # Default dependencies
+        dependencies = Dependencies(
+            runtime=package_info.runtime_dependencies,
+            build=package_info.build_dependencies,
+            manage=[],
+            acquire=[]
+        )
+
+        # Default provides - this would need more sophisticated detection
+        provides = {}
+
+        # Default flags
+        flags = []
+
+        # Default executables - this would need to be generated based on build system
+        execs = Executables(
+            build="scripts/build"
+        )
+
+        return cls(
+            name=package_info.name,
+            version=package_info.version,
+            summary=package_info.summary,
+            licences=package_info.licenses,
+            provides=provides,
+            depends=dependencies,
+            flags=flags,
+            execs=execs,
+            md=package_info.description,
+            url=package_info.url
+        )
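The `to_dict`/`to_json` pattern used by `USMManifest` above — hand-written serialization so enum members emit their `.value` and optional fields are omitted rather than serialized as `null` — can be sketched self-contained (the class and field names here are illustrative, not autusm's):

```python
import json
from dataclasses import dataclass, field
from enum import Enum
from typing import List, Optional

class Category(Enum):
    LIBRE = "libre"

@dataclass
class Licence:
    name: str
    category: Category

@dataclass
class Manifest:
    name: str
    version: str
    licences: List[Licence] = field(default_factory=list)
    url: Optional[str] = None

    def to_dict(self) -> dict:
        d = {
            "name": self.name,
            "version": self.version,
            # Enums serialize via .value so the JSON holds plain strings
            "licences": [{"name": l.name, "category": l.category.value}
                         for l in self.licences],
        }
        # Optional fields appear only when set, keeping the output minimal
        if self.url:
            d["url"] = self.url
        return d

m = Manifest("demo", "1.0", [Licence("MIT", Category.LIBRE)])
print(json.dumps(m.to_dict()))
```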

+ 393 - 0
src/autusm/usm_integration.py

@@ -0,0 +1,393 @@
+"""
+USM integration for autusm.
+
+This module provides functionality to integrate with the USM package manager,
+including running "usm manifest autoprovides" and parsing the output.
+"""
+
+import json
+import logging
+import subprocess
+from pathlib import Path
+from typing import Dict, List, Optional, Any
+
+from .models import Resource, PathBase, FileType
+from .exceptions import USMIntegrationError
+
+
+logger = logging.getLogger(__name__)
+
+
+class USMIntegration:
+    """Integration with USM package manager."""
+
+    def __init__(self):
+        """Initialize USM integration."""
+        self.usm_command = "usm"
+
+    def is_available(self) -> bool:
+        """Check if USM is available on the system.
+        
+        Returns:
+            True if USM is available, False otherwise
+        """
+        try:
+            # Try to run usm --version
+            result = subprocess.run(
+                [self.usm_command, "--version"],
+                capture_output=True,
+                text=True,
+                timeout=10
+            )
+            return result.returncode == 0
+        except (subprocess.TimeoutExpired, subprocess.SubprocessError, FileNotFoundError):
+            return False
+
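The availability probe above follows a common pattern for optional CLI dependencies: run `<tool> --version` with a timeout and treat every failure mode, including the binary not existing, as "not available". A self-contained sketch of the same idea (the function name is illustrative):

```python
import subprocess

def tool_available(cmd: str) -> bool:
    """Probe for a CLI tool: run `<cmd> --version` and treat any
    non-zero exit, timeout, or missing binary as 'not available'."""
    try:
        result = subprocess.run(
            [cmd, "--version"],
            capture_output=True,
            text=True,
            timeout=10,
        )
        return result.returncode == 0
    except (subprocess.TimeoutExpired, subprocess.SubprocessError, FileNotFoundError):
        # FileNotFoundError covers the binary simply not being on PATH
        return False

print(tool_available("definitely-not-a-real-tool"))  # → False
```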
+    def get_autoprovides(self, source_dir: Path) -> Dict[str, Any]:
+        """Get autoprovides from USM for a source directory.
+        
+        Args:
+            source_dir: Path to the source directory
+            
+        Returns:
+            Dictionary of autoprovides resources
+            
+        Raises:
+            USMIntegrationError: If USM integration fails
+        """
+        try:
+            logger.info(f"Getting autoprovides from USM for {source_dir}")
+            
+            # Run usm manifest autoprovides
+            cmd = [self.usm_command, "manifest", "autoprovides"]
+            result = subprocess.run(
+                cmd,
+                cwd=source_dir,
+                capture_output=True,
+                text=True,
+                timeout=300  # 5 minutes timeout
+            )
+            
+            if result.returncode != 0:
+                logger.warning(f"USM autoprovides failed: {result.stderr}")
+                return {}
+            
+            # Parse the output
+            autoprovides = self._parse_autoprovides(result.stdout)
+            
+            logger.info(f"Got {len(autoprovides)} autoprovides from USM")
+            return autoprovides
+            
+        except subprocess.TimeoutExpired:
+            logger.error("USM autoprovides timed out")
+            raise USMIntegrationError("USM autoprovides timed out")
+        except subprocess.SubprocessError as e:
+            logger.error(f"USM autoprovides failed: {e}")
+            raise USMIntegrationError(f"USM autoprovides failed: {e}")
+        except Exception as e:
+            logger.error(f"Unexpected error in USM integration: {e}")
+            raise USMIntegrationError(f"Unexpected error in USM integration: {e}")
+
+    def _parse_autoprovides(self, output: str) -> Dict[str, Any]:
+        """Parse the output of "usm manifest autoprovides".
+        
+        Args:
+            output: Output from USM command
+            
+        Returns:
+            Dictionary of autoprovides resources
+        """
+        try:
+            # Try to parse as JSON first
+            if output.strip().startswith('{'):
+                return json.loads(output)
+            
+            # If not JSON, try to parse line by line
+            autoprovides = {}
+            
+            for line in output.strip().split('\n'):
+                line = line.strip()
+                if not line or line.startswith('#'):
+                    continue
+                
+                # Parse lines in format "resource-type:resource-name path"
+                if ':' in line:
+                    parts = line.split(maxsplit=1)
+                    if len(parts) >= 2:
+                        resource_ref = parts[0]
+                        path_info = parts[1].strip()
+                        
+                        # Convert to resource object or shorthand
+                        if path_info == "as-expected":
+                            autoprovides[resource_ref] = "as-expected"
+                        else:
+                            # Try to parse as base:path format
+                            if ':' in path_info:
+                                path_base, path = path_info.split(':', 1)
+
+                                # Map path base strings to enum
+                                base_mapping = {
+                                    "source": PathBase.SOURCE,
+                                    "build": PathBase.BUILD,
+                                    "install": PathBase.INSTALL,
+                                    "as-expected": PathBase.AS_EXPECTED
+                                }
+
+                                if path_base in base_mapping:
+                                    autoprovides[resource_ref] = Resource(
+                                        path=path,
+                                        path_base=base_mapping[path_base],
+                                        type=FileType.REG
+                                    )
+                                else:
+                                    # Unknown base prefix: keep the whole
+                                    # string as a source-relative path
+                                    # instead of silently dropping the entry
+                                    autoprovides[resource_ref] = Resource(
+                                        path=path_info,
+                                        path_base=PathBase.SOURCE,
+                                        type=FileType.REG
+                                    )
+                            else:
+                                # Just a plain path
+                                autoprovides[resource_ref] = Resource(
+                                    path=path_info,
+                                    path_base=PathBase.SOURCE,
+                                    type=FileType.REG
+                                )
+            
+            return autoprovides
+            
+        except Exception as e:
+            logger.warning(f"Failed to parse autoprovides output: {e}")
+            return {}
+
+    def validate_manifest(self, manifest_path: Path) -> List[str]:
+        """Validate a USM manifest using USM.
+        
+        Args:
+            manifest_path: Path to the manifest file
+            
+        Returns:
+            List of validation errors
+        """
+        try:
+            logger.info(f"Validating manifest {manifest_path} with USM")
+            
+            # Run usm manifest validate
+            cmd = [self.usm_command, "manifest", "validate", str(manifest_path)]
+            result = subprocess.run(
+                cmd,
+                capture_output=True,
+                text=True,
+                timeout=60
+            )
+            
+            if result.returncode == 0:
+                return []
+            
+            # Parse errors from output
+            errors = []
+            for line in result.stderr.split('\n'):
+                line = line.strip()
+                if line and not line.startswith('#'):
+                    errors.append(line)
+            
+            return errors
+            
+        except subprocess.TimeoutExpired:
+            logger.error("USM manifest validation timed out")
+            return ["Validation timed out"]
+        except subprocess.SubprocessError as e:
+            logger.error(f"USM manifest validation failed: {e}")
+            return [f"Validation failed: {e}"]
+        except Exception as e:
+            logger.error(f"Unexpected error in manifest validation: {e}")
+            return [f"Unexpected error: {e}"]
+
+    def get_package_info(self, package_name: str) -> Optional[Dict[str, Any]]:
+        """Get information about an installed package.
+        
+        Args:
+            package_name: Name of the package
+            
+        Returns:
+            Package information dictionary or None if not found
+        """
+        try:
+            # Run usm info
+            cmd = [self.usm_command, "info", package_name]
+            result = subprocess.run(
+                cmd,
+                capture_output=True,
+                text=True,
+                timeout=30
+            )
+            
+            if result.returncode != 0:
+                return None
+            
+            # Parse the output
+            return self._parse_package_info(result.stdout)
+            
+        except subprocess.TimeoutExpired:
+            logger.error(f"USM info for {package_name} timed out")
+            return None
+        except subprocess.SubprocessError as e:
+            logger.error(f"USM info for {package_name} failed: {e}")
+            return None
+        except Exception as e:
+            logger.error(f"Unexpected error getting package info: {e}")
+            return None
+
+    def _parse_package_info(self, output: str) -> Dict[str, Any]:
+        """Parse the output of "usm info".
+        
+        Args:
+            output: Output from USM command
+            
+        Returns:
+            Package information dictionary
+        """
+        info = {}
+        
+        for line in output.strip().split('\n'):
+            line = line.strip()
+            if not line or line.startswith('#'):
+                continue
+            
+            # Parse lines in format "key: value"
+            if ':' in line:
+                parts = line.split(':', 1)
+                if len(parts) >= 2:
+                    key = parts[0].strip()
+                    value = parts[1].strip()
+                    info[key] = value
+        
+        return info
+
+    def list_installed_packages(self) -> List[Dict[str, Any]]:
+        """List all installed packages.
+        
+        Returns:
+            List of package information dictionaries
+        """
+        try:
+            # Run usm list
+            cmd = [self.usm_command, "list"]
+            result = subprocess.run(
+                cmd,
+                capture_output=True,
+                text=True,
+                timeout=60
+            )
+            
+            if result.returncode != 0:
+                return []
+            
+            # Parse the output
+            return self._parse_package_list(result.stdout)
+            
+        except subprocess.TimeoutExpired:
+            logger.error("USM list timed out")
+            return []
+        except subprocess.SubprocessError as e:
+            logger.error(f"USM list failed: {e}")
+            return []
+        except Exception as e:
+            logger.error(f"Unexpected error listing packages: {e}")
+            return []
+
+    def _parse_package_list(self, output: str) -> List[Dict[str, Any]]:
+        """Parse the output of "usm list".
+        
+        Args:
+            output: Output from USM command
+            
+        Returns:
+            List of package information dictionaries
+        """
+        packages = []
+        
+        for line in output.strip().split('\n'):
+            line = line.strip()
+            if not line or line.startswith('#'):
+                continue
+            
+            # Try to parse as JSON first
+            if line.startswith('{'):
+                try:
+                    packages.append(json.loads(line))
+                    continue
+                except json.JSONDecodeError:
+                    pass
+            
+            # Parse as tab-separated or space-separated
+            parts = line.split()
+            if len(parts) >= 2:
+                packages.append({
+                    "name": parts[0],
+                    "version": parts[1],
+                    "description": " ".join(parts[2:])
+                })
+        
+        return packages
+
+    def check_dependencies(self, package_names: List[str]) -> Dict[str, List[str]]:
+        """Check dependencies for packages.
+        
+        Args:
+            package_names: List of package names
+            
+        Returns:
+            Dictionary mapping package names to their dependencies
+        """
+        try:
+            # Run usm deps
+            cmd = [self.usm_command, "deps"] + package_names
+            result = subprocess.run(
+                cmd,
+                capture_output=True,
+                text=True,
+                timeout=60
+            )
+            
+            if result.returncode != 0:
+                return {name: [] for name in package_names}
+            
+            # Parse the output
+            return self._parse_dependencies(result.stdout, package_names)
+            
+        except subprocess.TimeoutExpired:
+            logger.error("USM deps timed out")
+            return {name: [] for name in package_names}
+        except subprocess.SubprocessError as e:
+            logger.error(f"USM deps failed: {e}")
+            return {name: [] for name in package_names}
+        except Exception as e:
+            logger.error(f"Unexpected error checking dependencies: {e}")
+            return {name: [] for name in package_names}
+
+    def _parse_dependencies(self, output: str, package_names: List[str]) -> Dict[str, List[str]]:
+        """Parse the output of "usm deps".
+        
+        Args:
+            output: Output from USM command
+            package_names: List of package names
+            
+        Returns:
+            Dictionary mapping package names to their dependencies
+        """
+        dependencies = {name: [] for name in package_names}
+        
+        current_package = None
+        
+        for line in output.strip().split('\n'):
+            line = line.strip()
+            if not line or line.startswith('#'):
+                continue
+            
+            # Check if this is a package name line
+            if line.endswith(':'):
+                current_package = line[:-1]
+                if current_package in dependencies:
+                    dependencies[current_package] = []
+            elif current_package and current_package in dependencies:
+                # This is a dependency
+                dependencies[current_package].append(line)
+        
+        return dependencies
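
The block format consumed by `_parse_dependencies` above (a `package:` header line followed by one dependency per line, with unknown packages ignored) can be exercised with a small standalone sketch; the sample `usm deps` output below is hypothetical, invented only to illustrate the parsing:

```python
# Minimal standalone re-implementation of the "pkg:" block parsing
# performed by _parse_dependencies; the sample input is hypothetical.
from typing import Dict, List


def parse_dependencies(output: str, package_names: List[str]) -> Dict[str, List[str]]:
    deps: Dict[str, List[str]] = {name: [] for name in package_names}
    current = None
    for line in output.strip().split("\n"):
        line = line.strip()
        if not line or line.startswith("#"):
            continue  # skip blanks and comments
        if line.endswith(":"):
            current = line[:-1]  # "foo:" opens the block for package foo
        elif current in deps:
            deps[current].append(line)  # any other line is a dependency
    return deps


sample = """\
foo:
libbar
libbaz
unknown-pkg:
libqux
"""
print(parse_dependencies(sample, ["foo"]))
# → {'foo': ['libbar', 'libbaz']}
```

Note that `libqux` is dropped because `unknown-pkg` was not among the requested package names, matching the `elif current_package and current_package in dependencies` guard above.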

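The non-JSON `autoprovides` line format handled by `_parse_autoprovides` ("resource-ref path", where the path may carry a `base:` prefix or be the literal `as-expected`) can likewise be sketched without the `Resource` model; tuples stand in for `Resource` objects here, and the sample input is hypothetical:

```python
# Simplified standalone sketch of the line-oriented parsing in
# _parse_autoprovides; (base, path) tuples replace Resource objects,
# and the sample usm output is invented for illustration.
def parse_autoprovides(output: str) -> dict:
    provides = {}
    for line in output.strip().split("\n"):
        line = line.strip()
        if not line or line.startswith("#"):
            continue  # skip blanks and comments
        ref, _, path_info = line.partition(" ")
        path_info = path_info.strip()
        if not path_info:
            continue  # no path given for this resource
        if path_info == "as-expected":
            provides[ref] = "as-expected"  # shorthand, no Resource needed
        else:
            base, sep, path = path_info.partition(":")
            # "build:lib/x.so" → ("build", "lib/x.so"); bare paths
            # default to the source tree, as in the module above
            provides[ref] = (base, path) if sep else ("source", path_info)
    return provides


sample = """\
# comment line
bin:mytool as-expected
lib:libfoo build:lib/libfoo.so
doc:readme README.md
"""
print(parse_autoprovides(sample))
# → {'bin:mytool': 'as-expected', 'lib:libfoo': ('build', 'lib/libfoo.so'),
#    'doc:readme': ('source', 'README.md')}
```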
+ 139 - 0
test_autusm.py

@@ -0,0 +1,139 @@
+#!/usr/bin/env python3
+"""
+Simple test script for autusm to verify the implementation.
+"""
+
+import sys
+import tempfile
+from pathlib import Path
+
+# Add the src directory to the path so we can import autusm
+sys.path.insert(0, str(Path(__file__).parent / "src"))
+
+from autusm.models import PackageInfo, BuildSystem, BuildSystemType, USMManifest
+from autusm.generator import ScriptGenerator
+from autusm.manifest import ManifestGenerator
+
+
+def test_models():
+    """Test the data models."""
+    print("Testing data models...")
+    
+    # Test PackageInfo
+    package_info = PackageInfo(
+        name="test-package",
+        version="1.0.0",
+        summary="A test package",
+        description="This is a test package for autusm"
+    )
+    
+    print(f"Created PackageInfo: {package_info.name} v{package_info.version}")
+    
+    # Test BuildSystem
+    build_system = BuildSystem(
+        type=BuildSystemType.CMAKE,
+        config_files=["CMakeLists.txt"],
+        build_files=["CMakeLists.txt"]
+    )
+    
+    print(f"Created BuildSystem: {build_system.type.value}")
+    
+    # Test USMManifest
+    manifest = USMManifest.from_package_info(package_info)
+    print(f"Created USMManifest: {manifest.name}")
+    
+    print("✓ Data models test passed\n")
+
+
+def test_script_generator():
+    """Test the script generator."""
+    print("Testing script generator...")
+    
+    package_info = PackageInfo(
+        name="test-package",
+        version="1.0.0",
+        summary="A test package"
+    )
+    
+    build_system = BuildSystem(
+        type=BuildSystemType.CMAKE,
+        config_files=["CMakeLists.txt"]
+    )
+    
+    generator = ScriptGenerator()
+    
+    with tempfile.TemporaryDirectory() as temp_dir:
+        output_dir = Path(temp_dir) / "scripts"
+        generator.generate_scripts(package_info, build_system, output_dir)
+        
+        # Check if scripts were created
+        acquire_script = output_dir / "acquire"
+        build_script = output_dir / "build"
+        install_script = output_dir / "install"
+        
+        # Raise on failure so main() reports it, instead of printing
+        # a cross and then claiming the test passed anyway
+        missing = [p.name for p in (acquire_script, build_script, install_script)
+                   if not p.exists()]
+        if missing:
+            raise AssertionError(f"Script generation failed, missing: {missing}")
+        print("✓ Scripts generated successfully")
+
+    print("✓ Script generator test passed\n")
+
+
+def test_manifest_generator():
+    """Test the manifest generator."""
+    print("Testing manifest generator...")
+    
+    package_info = PackageInfo(
+        name="test-package",
+        version="1.0.0",
+        summary="A test package",
+        description="This is a test package for autusm",
+        url="https://github.com/example/test-package"
+    )
+    
+    build_system = BuildSystem(
+        type=BuildSystemType.CMAKE,
+        config_files=["CMakeLists.txt"]
+    )
+    
+    generator = ManifestGenerator()
+    manifest = generator.generate(package_info, build_system)
+    
+    # Assert on each field so main() reports a real failure, instead of
+    # printing a cross and then claiming the test passed anyway
+    assert manifest.name == "test-package", "manifest name mismatch"
+    assert manifest.version == "1.0.0", "manifest version mismatch"
+    assert manifest.summary == "A test package", "manifest summary mismatch"
+    print("✓ Manifest generated successfully")
+
+    # Test JSON serialization
+    json_str = manifest.to_json()
+    assert '"name": "test-package"' in json_str, "JSON serialization failed"
+    print("✓ JSON serialization works")
+
+    print("✓ Manifest generator test passed\n")
+
+
+def main():
+    """Run all tests."""
+    print("Running autusm implementation tests...\n")
+    
+    try:
+        test_models()
+        test_script_generator()
+        test_manifest_generator()
+        
+        print("All tests passed! ✓")
+        return 0
+    except Exception as e:
+        print(f"Test failed: {e}")
+        import traceback
+        traceback.print_exc()
+        return 1
+
+
+if __name__ == "__main__":
+    sys.exit(main())