Qualcomm® AI Hub

Table of Contents

  • Overview of Qualcomm® AI Hub
  • Examples
  • API Documentation
    • API documentation
      • qai_hub.upload_dataset
      • qai_hub.upload_model
      • qai_hub.get_dataset
      • qai_hub.get_datasets
      • qai_hub.get_devices
      • qai_hub.get_device_attributes
      • qai_hub.get_frameworks
      • qai_hub.get_job
      • qai_hub.get_job_summaries
      • qai_hub.get_model
      • qai_hub.get_models
      • qai_hub.set_verbose
      • qai_hub.submit_compile_job
      • qai_hub.submit_quantize_job
      • qai_hub.submit_profile_job
      • qai_hub.submit_inference_job
      • qai_hub.submit_link_job
      • qai_hub.submit_compile_and_profile_jobs
      • qai_hub.submit_compile_and_quantize_jobs
      • Dataset
      • Device
      • Framework
      • Model
      • Job
      • JobSummary
      • CompileJobSummary
      • QuantizeJobSummary
      • ProfileJobSummary
      • InferenceJobSummary
      • LinkJobSummary
      • CompileJob
      • ProfileJob
      • QuantizeJob
      • InferenceJob
      • LinkJob
      • CompileJobResult
      • ProfileJobResult
      • QuantizeJobResult
      • InferenceJobResult
      • LinkJobResult
      • qai_hub.QuantizeDtype
      • JobStatus
      • qai_hub.JobType
      • qai_hub.SourceModelType
      • qai_hub.ModelMetadataKey
      • qai_hub.Error
      • qai_hub.UserError
      • qai_hub.InternalError
  • FAQ
  • Release Notes

Additional Resources

  • Qualcomm® AI Stack
  • Qualcomm® AI Hub Models
  • Qualcomm® AI Hub Apps
  • Qualcomm® AI Hub
  • Contact Us

qai_hub.submit_link_job

submit_link_job(models, device=None, name=None, options='')

Submits a link job.

A link job generates a context binary model from one or more input models. Each input model must be either a QNN DLC model or a context binary model produced by qai_hub.submit_compile_job() with the deprecated --qnn_bin_conversion_via_model_library option. Linking is particularly useful when the input models contain overlapping weights, since those weights are stored once and shared between the graphs in the resulting context binary.

To profile or run inference on a multi-graph QNN context binary, use --qnn_options context_enable_graphs=<graph name> to select the graph.
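For example, a profile job targeting one graph of a linked context binary might look like the following. This is a minimal sketch, not taken from this page: the model ID, the device name, and the graph name `graph_a` are all placeholders that you would replace with your own values.

```python
import qai_hub as hub

# Hedged sketch: profile a single graph of a multi-graph QNN context binary.
# "mabc1234" is a placeholder model ID, the device name is an assumed example,
# and "graph_a" must match the name of a graph inside your context binary.
linked_model = hub.get_model("mabc1234")

profile_job = hub.submit_profile_job(
    model=linked_model,
    device=hub.Device("Samsung Galaxy S24 (Family)"),
    options="--qnn_options context_enable_graphs=graph_a",
)
```

Submitting this job requires a configured AI Hub API token; the call returns immediately and the job runs remotely on the selected device.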

Parameters:
  • models (Union[Model, str, Path, None, List[Model | str | Path | None], list[Model]]) -- Models to link. Each model in the list must be a QNN DLC model or an AI Hub-compiled QNN context binary model.

  • name (Optional[str]) -- Optional name for the job. Job names need not be unique.

  • options (str) -- CLI-like flag options. See Link Options.

Returns:

job -- Returns the link job.

Return type:

LinkJob
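Putting it together, a link job submission might be sketched as follows. This is a hedged example assuming two already-compiled QNN DLC models with overlapping weights; the model IDs and job name are placeholders, and the use of get_target_model() to retrieve the linked context binary is an assumption based on the analogous compile-job workflow.

```python
import qai_hub as hub

# Hedged sketch: link two QNN DLC models that share weights into a single
# context binary. The model IDs below are placeholders for your own models.
model_a = hub.get_model("mabcd1111")
model_b = hub.get_model("mabcd2222")

link_job = hub.submit_link_job(
    models=[model_a, model_b],
    name="shared-weights-context",
)

# Assumed: retrieve the resulting multi-graph context binary once the job
# finishes, mirroring the CompileJob workflow.
linked_model = link_job.get_target_model()
```

Because the shared weights are deduplicated in the linked context binary, the result can be substantially smaller than deploying the two models separately.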


© 2025 Qualcomm Technologies, Inc. and/or its affiliated companies.
