
Conversation

litongjava
Contributor

@litongjava litongjava commented Aug 1, 2025

1. Convert java-api to a Maven project
2. Publish java-api to Maven Central

Summary by CodeRabbit

  • New Features

    • Added a user guide in both English and Chinese for the Sherpa-ONNX Java API, including setup instructions, usage examples, and troubleshooting tips.
    • Introduced a Maven project descriptor file for streamlined Java project management.
    • Added a build configuration file supporting Windows, Linux, and macOS.
    • Introduced a utility to automatically load native libraries based on the user's platform.
    • Added a new interface defining native library filenames for different operating systems.
  • Documentation

    • Comprehensive guides now available in English and Chinese for end-users.
    • Added instructions for downloading and preparing JNI native libraries for multiple platforms.
    • Added README detailing JNI library downloads and folder structure.
  • Chores

    • Updated the .gitignore file to cover a wide range of development environments and tools.
    • Adjusted the Makefile to use standard Java source directory layout.
    • Added .gitignore and README files for JNI native library resources.
  • Style

    • Minor formatting and whitespace improvements across several Java source files.
    • Corrected a typo in a Java API setter method name.
  • Bug Fixes

    • Added validation checks to ensure valid configuration values in the Java API builder.
    • Improved resource management by centralizing native resource cleanup methods.
    • Added input validation to prevent invalid method arguments.


coderabbitai bot commented Aug 1, 2025

Warning

Rate limit exceeded

@litongjava has exceeded the limit for the number of commits or files that can be reviewed per hour. Please wait 0 minutes and 42 seconds before requesting another review.

⌛ How to resolve this issue?

After the wait time has elapsed, a review can be triggered using the @coderabbitai review command as a PR comment. Alternatively, push new commits to this PR.

We recommend that you space out your commits to avoid hitting the rate limit.

🚦 How do rate limits work?

CodeRabbit enforces hourly rate limits for each developer per organization.

Our paid plans have higher rate limits than the trial, open-source and free plans. In all cases, we re-allow further reviews after a brief timeout.

Please see our FAQ for further information.

📥 Commits

Reviewing files that changed from the base of the PR and between fcf7387 and c5a7d15.

📒 Files selected for processing (2)
  • sherpa-onnx/java-api/src/main/java/com/k2fsa/sherpa/onnx/utils/LibraryUtils.java (1 hunks)
  • sherpa-onnx/java-api/src/main/resources/readme.md (1 hunks)

Walkthrough

This update introduces several new configuration and documentation files to the sherpa-onnx/java-api directory, including build scripts, a Maven project descriptor, and multilingual user guides. Additionally, the Makefile's source path is adjusted for standard Java project structure. A minor typo fix is applied in one Java source file. The FeatureConfig.Builder.build() and OnlineRecognizerConfig.Builder.build() methods are improved by adding validation checks for field values. Several Java source files are enhanced with input validations and resource management refactoring. A new utility class for native library loading is added, and the native library loading logic is updated accordingly. All other Java source files are reformatted without functional changes.
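For illustration only, here is a rough sketch of the extract-and-load flow described above for LibraryUtils; the resource layout, target directory, and error handling are assumptions, not the actual implementation in this PR:

import java.io.InputStream;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.nio.file.StandardCopyOption;

public final class LibraryUtils {
    public static void load() {
        String osName = System.getProperty("os.name").toLowerCase();
        String os = osName.contains("win") ? "win" : osName.contains("mac") ? "osx" : "linux";
        String arch = System.getProperty("os.arch").toLowerCase().contains("aarch64") ? "aarch64" : "x64";
        String libName = System.mapLibraryName("sherpa-onnx-jni");
        // Assumed resource layout inside the jar: /natives/<os>-<arch>/<libName>
        String resource = "/natives/" + os + "-" + arch + "/" + libName;
        try (InputStream in = LibraryUtils.class.getResourceAsStream(resource)) {
            if (in == null) {
                throw new IllegalStateException("No bundled native library at " + resource);
            }
            // Extract into a directory under the user's home, then load by absolute path.
            Path target = Paths.get(System.getProperty("user.home"), ".sherpa-onnx", libName);
            Files.createDirectories(target.getParent());
            Files.copy(in, target, StandardCopyOption.REPLACE_EXISTING);
            System.load(target.toAbsolutePath().toString());
        } catch (java.io.IOException e) {
            throw new RuntimeException("Failed to extract native library", e);
        }
    }

    private LibraryUtils() {}
}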

Changes

Cohort / File(s) Change Summary
Build and Configuration Files
sherpa-onnx/java-api/.build.txt, sherpa-onnx/java-api/pom.xml, sherpa-onnx/java-api/Makefile
Added a new build configuration file defining platform-specific Maven build commands and environment variables. Introduced a Maven pom.xml with project metadata, Java 1.8 compiler settings, license, developer info, and publishing plugins. Updated the Makefile to use the standard Maven/Gradle Java source directory layout (src/main/java). No changes to Java source code logic.
Documentation
sherpa-onnx/java-api/readme.md, sherpa-onnx/java-api/readme.zh.md, sherpa-onnx/java-api/src/main/resources/readme.md
Added comprehensive user guides in English and Chinese covering setup prerequisites, Maven dependency inclusion, native JNI library configuration for multiple platforms, Kokoro TTS model preparation, detailed Java example usage, and troubleshooting tips for the Sherpa-ONNX Java API. Added a README in resources with instructions for downloading and placing JNI libraries for various platforms and architectures.
Ignore Rules
sherpa-onnx/java-api/.gitignore, sherpa-onnx/java-api/src/main/resources/.gitignore
Replaced a minimal .gitignore with an extensive, consolidated template covering ignore patterns for various IDEs (Eclipse, JetBrains), build tools (Java, Leiningen, Gradle), languages (Python, Node.js), OS-specific files, and exceptions like the Gradle wrapper jar. Added a .gitignore in resources to ignore platform-specific JNI library directories.
Java Source Files – Typo Fix
sherpa-onnx/java-api/src/main/java/com/k2fsa/sherpa/onnx/EndpointConfig.java
Fixed a typo in the Builder class by renaming the setter method from setRul3 to setRule3.
Java Source Files – Builder Validation Added
sherpa-onnx/java-api/src/main/java/com/k2fsa/sherpa/onnx/FeatureConfig.java, .../OnlineRecognizerConfig.java
Enhanced builder methods with validation: FeatureConfig.Builder.build() now throws IllegalArgumentException if sampleRate or featureDim ≤ 0, or dither < 0. OnlineRecognizerConfig.Builder.build() validates positive maxActivePaths and non-negative hotwordsScore and blankPenalty.
Java Source Files – Resource Management Refactoring and Input Validation
sherpa-onnx/java-api/src/main/java/com/k2fsa/sherpa/onnx/OnlineRecognizer.java, .../OnlineStream.java
Refactored resource cleanup in OnlineRecognizer and OnlineStream classes by introducing a close() method encapsulating native resource deletion, called by release() and finalize(). Added input validation in OnlineRecognizer.decode(OnlineStream[]) to throw IllegalArgumentException if input is null or empty. No other logic or control flow changes.
Java Source Files – Native Library Loading Refactor
sherpa-onnx/java-api/src/main/java/com/k2fsa/sherpa/onnx/LibraryLoader.java, .../utils/LibraryUtils.java, .../core/Core.java
Modified LibraryLoader.loadLibrary() to delegate native library loading to the new utility class LibraryUtils.load(). Added LibraryUtils class which detects OS and architecture, extracts the appropriate native library from resources to a user directory, and loads it via System.load(). Added Core interface defining native library filenames for Windows, Unix, and macOS. This refactor centralizes and enhances native library loading with platform-specific support and resource extraction.
Java Source Files – Whitespace/Formatting Only
sherpa-onnx/java-api/src/main/java/com/k2fsa/sherpa/onnx/EndpointRule.java, .../OnlineLMConfig.java, .../OnlineModelConfig.java, .../OnlineParaformerModelConfig.java, .../OnlineTransducerModelConfig.java
Reformatted these Java source files with whitespace and formatting adjustments only. No changes to logic, control flow, error handling, or public interfaces.

Sequence Diagram(s)

Not applicable: Changes primarily involve configuration, documentation, validation additions, resource management refactoring, and native library loading utility introduction without new control flow or feature interactions.

Estimated code review effort

🎯 3 (Moderate) | ⏱️ ~20 minutes

Poem

A bunny hops with careful care,
Fixing typos here and there.
Guides in two tongues now appear,
Build scripts set and paths made clear.
Validation guards the build’s way,
Native loads now smart and fey.
Resources close with gentle flair,
This rabbit’s work is everywhere! 🐇✨

✨ Finishing Touches
  • 📝 Generate Docstrings
🧪 Generate unit tests
  • Create PR with unit tests
  • Post copyable unit tests in a comment

Thanks for using CodeRabbit! It's free for OSS, and your support helps us grow. If you like it, consider giving us a shout-out.

🪧 Tips

Chat

There are 3 ways to chat with CodeRabbit:

  • Review comments: Directly reply to a review comment made by CodeRabbit. Example:
    • I pushed a fix in commit <commit_id>, please review it.
    • Explain this complex logic.
    • Open a follow-up GitHub issue for this discussion.
  • Files and specific lines of code (under the "Files changed" tab): Tag @coderabbitai in a new review comment at the desired location with your query. Examples:
    • @coderabbitai explain this code block.
  • PR comments: Tag @coderabbitai in a new PR comment to ask questions about the PR branch. For the best results, please provide a very specific query, as very limited context is provided in this mode. Examples:
    • @coderabbitai gather interesting stats about this repository and render them as a table. Additionally, render a pie chart showing the language distribution in the codebase.
    • @coderabbitai read src/utils.ts and explain its main purpose.
    • @coderabbitai read the files in the src/scheduler package and generate a class diagram using mermaid and a README in the markdown format.

Support

Need help? Create a ticket on our support page for assistance with any issues or questions.

CodeRabbit Commands (Invoked using PR comments)

  • @coderabbitai pause to pause the reviews on a PR.
  • @coderabbitai resume to resume the paused reviews.
  • @coderabbitai review to trigger an incremental review. This is useful when automatic reviews are disabled for the repository.
  • @coderabbitai full review to do a full review from scratch and review all the files again.
  • @coderabbitai summary to regenerate the summary of the PR.
  • @coderabbitai generate docstrings to generate docstrings for this PR.
  • @coderabbitai generate sequence diagram to generate a sequence diagram of the changes in this PR.
  • @coderabbitai generate unit tests to generate unit tests for this PR.
  • @coderabbitai resolve resolve all the CodeRabbit review comments.
  • @coderabbitai configuration to show the current CodeRabbit configuration for the repository.
  • @coderabbitai help to get help.

Other keywords and placeholders

  • Add @coderabbitai ignore anywhere in the PR description to prevent this PR from being reviewed.
  • Add @coderabbitai summary to generate the high-level summary at a specific location in the PR description.
  • Add @coderabbitai anywhere in the PR title to generate the title automatically.

CodeRabbit Configuration File (.coderabbit.yaml)

  • You can programmatically configure CodeRabbit by adding a .coderabbit.yaml file to the root of your repository.
  • Please see the configuration documentation for more information.
  • If your editor has YAML language server enabled, you can add the path at the top of this file to enable auto-completion and validation: # yaml-language-server: $schema=https://coderabbit.ai/integrations/schema.v2.json

Documentation and Community

  • Visit our Documentation for detailed information on how to use CodeRabbit.
  • Join our Discord Community to get help, request features, and share feedback.
  • Follow us on X/Twitter for updates and announcements.

@coderabbitai coderabbitai bot left a comment

Actionable comments posted: 11

🔭 Outside diff range comments (1)
sherpa-onnx/java-api/Makefile (1)

121-124: Rule misses nested package dirs

Pattern rule assumes all .java live exactly one level below $(package_dir); nested sub-packages (e.g., util/Helper.java) won’t compile.
Use javac’s ability to ingest source file list instead:

$(out_jar): $(java_files)
	@mkdir -p $(out_dir)
	javac -d $(out_dir) $(java_files)
	jar cvf $@ -C $(out_dir) .
🧹 Nitpick comments (20)
sherpa-onnx/java-api/src/main/java/com/k2fsa/sherpa/onnx/EndpointRule.java (2)

30-32: Rename boolean getter to follow JavaBean convention

Using getMustContainNonSilence() for a boolean slightly deviates from the common is* prefix convention (isMustContainNonSilence()).
Renaming improves interoperability with frameworks/tools that rely on the JavaBean spec for property introspection.
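For illustration, a minimal sketch of the renamed accessor, assuming the backing field is called mustContainNonSilence (inferred from the existing getter name):

// JavaBean-style boolean accessor; keeping the old getter as a deprecated alias is optional.
public boolean isMustContainNonSilence() {
    return mustContainNonSilence;
}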


39-41: Add basic value validation in Builder.build()

minTrailingSilence and minUtteranceLength logically cannot be negative. Enforcing this early prevents invalid objects from being instantiated.

     public EndpointRule build() {
-        return new EndpointRule(this);
+        if (minTrailingSilence < 0 || minUtteranceLength < 0) {
+            throw new IllegalArgumentException(
+                "minTrailingSilence and minUtteranceLength must be non-negative");
+        }
+        return new EndpointRule(this);
     }
sherpa-onnx/java-api/src/main/java/com/k2fsa/sherpa/onnx/OnlineLMConfig.java (1)

32-44: Consider adding basic argument validation inside Builder.build()
Right now any value is accepted, including an empty model string or a non-positive scale, both of which will likely yield runtime errors further downstream. A couple of cheap null / range checks here would fail-fast and surface misconfiguration closer to the call site.

public OnlineLMConfig build() {
-    return new OnlineLMConfig(this);
+    // Fail-fast validation
+    if (model == null || model.isEmpty()) {
+        throw new IllegalArgumentException("model path must be non-empty");
+    }
+    if (scale <= 0) {
+        throw new IllegalArgumentException("scale must be positive");
+    }
+    return new OnlineLMConfig(this);
}

Verify that similar config builders across the codebase perform (or intentionally omit) such checks for consistency.

sherpa-onnx/java-api/src/main/java/com/k2fsa/sherpa/onnx/OnlineTransducerModelConfig.java (2)

34-37: Prefer null default over empty string or require explicit set

Using "" as a sentinel value makes it impossible to distinguish “unset” from “set to empty”, and forces the check above. Consider:

-        private String encoder = "";
-        private String decoder = "";
-        private String joiner = "";
+        private String encoder;
+        private String decoder;
+        private String joiner;

(Combined with the validation in build() this keeps the API safe and intention-revealing.)


6-32: Consider overriding toString(), equals(), hashCode()

OnlineTransducerModelConfig will likely be logged, compared, or stored in collections.
Overriding these methods improves debuggability and correct behaviour in hash-based containers.
Simple implementation via Objects.hash() is enough.

sherpa-onnx/java-api/src/main/java/com/k2fsa/sherpa/onnx/FeatureConfig.java (2)

6-16: Mark the configuration class final to guarantee immutability

FeatureConfig is designed to be immutable, yet it is still inheritable. Making the class final (or providing a private constructor plus a final class) prevents accidental subclassing that could break invariants or introduce mutability.

-public class FeatureConfig {
+public final class FeatureConfig {

34-37: Expose default constants instead of repeating literals

The defaults (16000, 80, 0.0f) are duplicated in tests or downstream code more easily if surfaced as public static final constants. This avoids “magic numbers” and helps keep the defaults in a single place.

     public static class Builder {
-        private int sampleRate = 16000;
-        private int featureDim = 80;
-        private float dither = 0.0f;
+        public static final int DEFAULT_SAMPLE_RATE = 16000;
+        public static final int DEFAULT_FEATURE_DIM  = 80;
+        public static final float DEFAULT_DITHER     = 0.0f;
+
+        private int sampleRate = DEFAULT_SAMPLE_RATE;
+        private int featureDim = DEFAULT_FEATURE_DIM;
+        private float dither   = DEFAULT_DITHER;
sherpa-onnx/java-api/src/main/java/com/k2fsa/sherpa/onnx/OnlineParaformerModelConfig.java (2)

27-44: Builder could expose clearer “withX” methods & stronger typing

  1. Naming the mutators setEncoder / setDecoder inside a builder is slightly misleading—withEncoder / withDecoder (or encoder() / decoder()) is the common convention.
  2. These are filesystem paths; consider java.nio.file.Path instead of raw String to gain compile-time safety and platform independence.

No functional change is required, but adopting these adjustments will improve API ergonomics and reduce misuse.
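For illustration only, a sketch of what the suggested builder style could look like; the withEncoder/withDecoder names and the Path-based signatures are suggestions, not part of the current API:

import java.nio.file.Path;

// Inside OnlineParaformerModelConfig.Builder (sketch):
public Builder withEncoder(Path encoderPath) {
    // The native layer still receives a String, so convert at the boundary (assumption).
    this.encoder = encoderPath.toString();
    return this;
}

public Builder withDecoder(Path decoderPath) {
    this.decoder = decoderPath.toString();
    return this;
}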


6-26: Override toString, equals, and hashCode for better debuggability & collection support
This config object is immutable and will likely be logged, compared, or stored in collections. Implementing the standard trio will aid diagnostics and prevent subtle bugs when used as map keys or set elements.

 public class OnlineParaformerModelConfig {
     private final String encoder;
     private final String decoder;

+    @Override
+    public String toString() {
+        return "OnlineParaformerModelConfig{encoder='" + encoder + "', decoder='" + decoder + "'}";
+    }
+
+    @Override
+    public boolean equals(Object o) {
+        if (this == o) return true;
+        if (!(o instanceof OnlineParaformerModelConfig)) return false;
+        OnlineParaformerModelConfig that = (OnlineParaformerModelConfig) o;
+        return encoder.equals(that.encoder) && decoder.equals(that.decoder);
+    }
+
+    @Override
+    public int hashCode() {
+        return java.util.Objects.hash(encoder, decoder);
+    }
sherpa-onnx/java-api/src/main/java/com/k2fsa/sherpa/onnx/OnlineRecognizerConfig.java (2)

7-22: Expose the remaining configuration fields or document why they are intentionally hidden

Only getModelConfig() is public, while callers have no accessor for the other 11 immutable fields.
Unless the intent is to keep those details internal, provide getters (or toString()/equals/hashCode) so that downstream code can inspect, log, or compare full configs.

+    public FeatureConfig getFeatConfig()              { return featConfig; }
+    public OnlineLMConfig getLmConfig()               { return lmConfig; }
+    public OnlineCtcFstDecoderConfig getCtcFstDecoderConfig() { return ctcFstDecoderConfig; }
+    public EndpointConfig getEndpointConfig()         { return endpointConfig; }
+    public HomophoneReplacerConfig getHr()            { return hr; }
+    public boolean isEnableEndpoint()                 { return enableEndpoint; }
+    public String getDecodingMethod()                 { return decodingMethod; }
+    public int getMaxActivePaths()                    { return maxActivePaths; }
+    public String getHotwordsFile()                   { return hotwordsFile; }
+    public float getHotwordsScore()                   { return hotwordsScore; }
+    public String getRuleFsts()                       { return ruleFsts; }
+    public String getRuleFars()                       { return ruleFars; }
+    public float getBlankPenalty()                    { return blankPenalty; }

93-97: Rename hr to a clearer field name

hr is cryptic compared to the otherwise descriptive field names.
Consider homophoneReplacerConfig to improve readability.

-        private HomophoneReplacerConfig hr = HomophoneReplacerConfig.builder().build();
+        private HomophoneReplacerConfig homophoneReplacerConfig = HomophoneReplacerConfig.builder().build();

(and adjust usages accordingly)

sherpa-onnx/java-api/src/main/java/com/k2fsa/sherpa/onnx/OnlineRecognizer.java (2)

7-7: Clarify thread-safety of the native pointer
ptr is mutated by release() and read by every public method, yet it is not volatile and no synchronization is used. If an OnlineRecognizer instance is accessed from multiple threads (for example, decoding while another thread closes it), visibility issues or use-after-free native crashes become possible.
Consider either documenting single-threaded usage, guarding all accesses with a lock, or making ptr volatile and using an atomic “closed” flag.


57-63: Avoid unchecked casts from raw Object[]
Returning a raw Object[] from JNI forces multiple unchecked casts and hides type mismatches until runtime. Wrap the JNI layer to return a dedicated DTO or unpack in native code and return a strongly-typed Java object directly.
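As a hedged illustration of the DTO approach, the class and field names below (RecognizerResult, text, tokens, timestamps) are hypothetical and may not match the actual JNI payload:

// Hypothetical strongly-typed wrapper; confines the unchecked casts to one place.
public final class RecognizerResult {
    private final String text;
    private final String[] tokens;
    private final float[] timestamps;

    public RecognizerResult(Object[] raw) {
        this.text = (String) raw[0];
        this.tokens = (String[]) raw[1];
        this.timestamps = (float[]) raw[2];
    }

    public String getText() { return text; }
    public String[] getTokens() { return tokens; }
    public float[] getTimestamps() { return timestamps; }
}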

sherpa-onnx/java-api/.build.txt (1)

1-17: Hard-coded JDK paths & ancient 8u121 limit portability

Hard-wiring JAVA_HOME to …/jdk1.8.0_121 means the script will break on systems without that exact path and prevents use of a patched JDK.
Suggest:

-[win.env]
-set JAVA_HOME=D:\java\jdk1.8.0_121
+[win.env]
+REM Override if JAVA_HOME already present
+IF NOT DEFINED JAVA_HOME (
+  set "JAVA_HOME=%ProgramFiles%\Java\jdk1.8.0_**latest**"
+)

Repeat for Linux/macOS (${JAVA_HOME:-/usr/lib/jvm/java-1.8-openjdk} etc.).
Also drop -DskipTests once unit tests land and remove -Dgpg.skip=true in CI so artifacts are signed for Central.

sherpa-onnx/java-api/Makefile (1)

7-105: Manual file list is brittle – generate it

Maintaining ~100 filenames by hand is error-prone and new classes will silently be excluded.

-# Hard-coded list …
-java_files := LibraryLoader.java
-
+# Automatically collect all .java under src/main/java
+java_files := $(shell find src/main/java/$(package_dir) -name '*.java')

This eliminates duplicate bookkeeping and matches the Maven layout you just adopted.

sherpa-onnx/java-api/.gitignore (1)

5-5: Eliminate duplicate ignore entries

classes/, local.properties, and build/ each appear twice. Duplicates add visual clutter and make future edits harder to reason about.

-classes/
 ...
-local.properties
 ...
-build/

Remove the second occurrence of each to keep the file concise.

Also applies to: 117-117, 12-12, 140-140, 93-93, 144-144

sherpa-onnx/java-api/readme.md (4)

3-3: Use a proper heading rather than italic text

*Applicable to Windows / macOS / Linux ...* is meant as a section title, not emphasis.
Replace with a heading for better structure and to satisfy markdown-lint MD036.

-*Applicable to Windows / macOS / Linux (Windows used as example for dynamic library loading)*
+### Applicability  
+Windows / macOS / Linux  
+(Windows is used below when showing dynamic-library examples)

61-63: Add a language specifier to fenced block

This code block triggers markdown-lint MD040. Prefix it with sh (or bat on Windows) so tooling knows how to highlight it.

-```
+```sh
-Djava.library.path=path\to\directory\containing\sherpa-onnx-jni.dll
+java -Djava.library.path=path\to\directory\containing\sherpa-onnx-jni.dll \
+     -jar your-app.jar

97-100: Specify language for bare-URL code block

-```
+```text
 https://k2-fsa.github.io/sherpa/onnx/tts/pretrained_models/kokoro.html

Prevents MD040 and renders consistently across viewers.


174-177: Follow Java naming conventions

real_time_factor uses snake_case, which is uncommon in Java. Consider camelCase for consistency with the rest of the snippet.

-float real_time_factor = timeElapsedSeconds / audioDuration;
+float realTimeFactor = timeElapsedSeconds / audioDuration;
 ...
-System.out.printf("-- real-time factor (RTF): %.3f\n", real_time_factor);
+System.out.printf("-- real-time factor (RTF): %.3f\n", realTimeFactor);
📜 Review details

Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between ee9bd25 and ff9b392.

📒 Files selected for processing (16)
  • sherpa-onnx/java-api/.build.txt (1 hunks)
  • sherpa-onnx/java-api/.gitignore (1 hunks)
  • sherpa-onnx/java-api/Makefile (2 hunks)
  • sherpa-onnx/java-api/pom.xml (1 hunks)
  • sherpa-onnx/java-api/readme.md (1 hunks)
  • sherpa-onnx/java-api/readme.zh.md (1 hunks)
  • sherpa-onnx/java-api/src/main/java/com/k2fsa/sherpa/onnx/EndpointConfig.java (1 hunks)
  • sherpa-onnx/java-api/src/main/java/com/k2fsa/sherpa/onnx/EndpointRule.java (1 hunks)
  • sherpa-onnx/java-api/src/main/java/com/k2fsa/sherpa/onnx/FeatureConfig.java (1 hunks)
  • sherpa-onnx/java-api/src/main/java/com/k2fsa/sherpa/onnx/OnlineLMConfig.java (1 hunks)
  • sherpa-onnx/java-api/src/main/java/com/k2fsa/sherpa/onnx/OnlineModelConfig.java (1 hunks)
  • sherpa-onnx/java-api/src/main/java/com/k2fsa/sherpa/onnx/OnlineParaformerModelConfig.java (1 hunks)
  • sherpa-onnx/java-api/src/main/java/com/k2fsa/sherpa/onnx/OnlineRecognizer.java (1 hunks)
  • sherpa-onnx/java-api/src/main/java/com/k2fsa/sherpa/onnx/OnlineRecognizerConfig.java (1 hunks)
  • sherpa-onnx/java-api/src/main/java/com/k2fsa/sherpa/onnx/OnlineStream.java (1 hunks)
  • sherpa-onnx/java-api/src/main/java/com/k2fsa/sherpa/onnx/OnlineTransducerModelConfig.java (1 hunks)
🧰 Additional context used
🧬 Code Graph Analysis (2)
sherpa-onnx/java-api/src/main/java/com/k2fsa/sherpa/onnx/OnlineRecognizer.java (1)
sherpa-onnx/java-api/src/main/java/com/k2fsa/sherpa/onnx/LibraryLoader.java (1)
  • LibraryLoader (3-23)
sherpa-onnx/java-api/src/main/java/com/k2fsa/sherpa/onnx/OnlineStream.java (1)
sherpa-onnx/java-api/src/main/java/com/k2fsa/sherpa/onnx/LibraryLoader.java (1)
  • LibraryLoader (3-23)
🪛 markdownlint-cli2 (0.17.2)
sherpa-onnx/java-api/readme.md

3-3: Emphasis used instead of a heading

(MD036, no-emphasis-as-heading)


60-60: Fenced code blocks should have a language specified

(MD040, fenced-code-language)


92-92: Fenced code blocks should have a language specified

(MD040, fenced-code-language)


110-110: Fenced code blocks should have a language specified

(MD040, fenced-code-language)


190-190: Fenced code blocks should have a language specified

(MD040, fenced-code-language)

🔇 Additional comments (3)
sherpa-onnx/java-api/src/main/java/com/k2fsa/sherpa/onnx/OnlineStream.java (1)

9-16: Library may remain unloaded for the long ptr-based ctor

The no-arg constructor ensures the JNI library is loaded (LibraryLoader.maybeLoad()), but the (long ptr) constructor omits the call.
If user code instantiates OnlineStream directly with a native pointer before anything else has triggered LibraryLoader, UnsatisfiedLinkError will occur when the first native method is invoked.

Either always call LibraryLoader.maybeLoad() in a static block, or invoke it from the second constructor as well.
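A minimal sketch of the static-block option, assuming LibraryLoader.maybeLoad() is idempotent (as the no-arg constructor already relies on):

public class OnlineStream {
    // Runs once when the class is first used, regardless of which constructor is called.
    static {
        LibraryLoader.maybeLoad();
    }

    public OnlineStream(long ptr) {
        this.ptr = ptr;
    }
    // remaining members unchanged
}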

sherpa-onnx/java-api/src/main/java/com/k2fsa/sherpa/onnx/OnlineModelConfig.java (1)

6-152: No functional changes detected – looks good

The class remains a straightforward immutable DTO + builder; no behavioural regressions introduced.

sherpa-onnx/java-api/readme.zh.md (1)

1-200: LGTM – valuable, detailed Chinese guide

Clear instructions, examples, and troubleshooting steps — great addition.

Comment on lines 1 to 12
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 https://maven.apache.org/xsd/maven-4.0.0.xsd">
  <modelVersion>4.0.0</modelVersion>
  <groupId>com.litongjava</groupId>
  <artifactId>sherpa-onnx-java-api</artifactId>
  <version>1.0.0</version>
  <properties>
    <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
    <java.version>1.8</java.version>
    <maven.compiler.source>${java.version}</maven.compiler.source>
    <maven.compiler.target>${java.version}</maven.compiler.target>
  </properties>
</project>

⚠️ Potential issue

Blocking: POM is not Maven-Central compliant & groupId diverges from package namespace

To deploy to Central you must supply the mandatory metadata sections (<name>, <description>, <url>, <licenses>, <scm>, <developers>).
Additionally, the Java sources live under com.k2fsa.*, while the artifact is published under com.litongjava; this mismatch will confuse consumers and clashes with the reverse-DNS convention.

   <artifactId>sherpa-onnx-java-api</artifactId>
   <version>1.0.0</version>
+  <name>Sherpa-ONNX Java API</name>
+  <description>Java bindings for Sherpa-ONNX streaming/offline ASR & TTS</description>
+  <url>https://github.com/k2-fsa/sherpa-onnx</url>
+
+  <licenses>
+    <license>
+      <name>Apache License, Version 2.0</name>
+      <url>https://www.apache.org/licenses/LICENSE-2.0</url>
+    </license>
+  </licenses>
+
+  <scm>
+    <connection>scm:git:git://github.com/k2-fsa/sherpa-onnx.git</connection>
+    <developerConnection>scm:git:ssh://github.com:k2-fsa/sherpa-onnx.git</developerConnection>
+    <url>https://github.com/k2-fsa/sherpa-onnx</url>
+  </scm>
+
+  <developers>
+    <developer>
+      <id>litongjava</id>
+      <name>Li Tong</name>
+    </developer>
+  </developers>

Consider changing <groupId> to com.k2fsa (or relocating the packages) so the coordinates and byte-code agree.

Failing to add the sections above will make OSSRH staging reject the deployment.

🤖 Prompt for AI Agents
In sherpa-onnx/java-api/pom.xml lines 1 to 12, the POM lacks mandatory metadata
sections required for Maven Central deployment and has a groupId that does not
match the Java package namespace. Add the required sections: <name>,
<description>, <url>, <licenses>, <scm>, and <developers> with appropriate
project information. Also, update the <groupId> to com.k2fsa to align with the
Java package namespace or alternatively refactor the Java packages to match the
current groupId. This will ensure compliance with Maven Central requirements and
avoid deployment rejection.

Comment on lines 48 to 55
// You'd better call it manually if it is not used anymore
public void release() {
    if (this.ptr == 0) {
        return;
    }
    delete(this.ptr);
    this.ptr = 0;
}

🛠️ Refactor suggestion

Rename release() to close() and make it idempotent-safe
Once AutoCloseable is implemented (see previous comment) the canonical resource-release method should be close(). Keep the current body but delegate from release() for back-compatibility:

+    @Deprecated
+    public void release() {
+        close();
+    }
+
     @Override
-    public void release() {
+    public void close() {
         if (this.ptr == 0) {
             return;
         }
         delete(this.ptr);
         this.ptr = 0;
     }

Committable suggestion skipped: line range outside the PR's diff.

🤖 Prompt for AI Agents
In
sherpa-onnx/java-api/src/main/java/com/k2fsa/sherpa/onnx/OnlineRecognizer.java
around lines 48 to 55, rename the method release() to close() to follow the
AutoCloseable convention. Keep the existing logic inside close() and modify
release() to simply call close() for backward compatibility, ensuring the
resource release method is idempotent-safe and consistent with Java standards.

Comment on lines 48 to 63
public static class Builder {
    private FeatureConfig featConfig = FeatureConfig.builder().build();
    private OnlineModelConfig modelConfig = OnlineModelConfig.builder().build();
    private OnlineLMConfig lmConfig = OnlineLMConfig.builder().build();
    private OnlineCtcFstDecoderConfig ctcFstDecoderConfig = OnlineCtcFstDecoderConfig.builder().build();
    private EndpointConfig endpointConfig = EndpointConfig.builder().build();
    private HomophoneReplacerConfig hr = HomophoneReplacerConfig.builder().build();
    private boolean enableEndpoint = true;
    private String decodingMethod = "greedy_search";
    private int maxActivePaths = 4;
    private String hotwordsFile = "";
    private float hotwordsScore = 1.5f;
    private String ruleFsts = "";
    private String ruleFars = "";
    private float blankPenalty = 0.0f;


🛠️ Refactor suggestion

Add basic validation inside the builder before constructing an instance

Currently any null, negative, or otherwise invalid value is silently accepted.
Recommend validating at least:

  • decodingMethod not null/blank
  • maxActivePaths > 0
  • hotwordsScore >= 1.0 (or documented lower-bound)
  • blankPenalty >= 0
         public OnlineRecognizerConfig build() {
-            return new OnlineRecognizerConfig(this);
+            Objects.requireNonNull(decodingMethod, "decodingMethod");
+            if (maxActivePaths <= 0) {
+                throw new IllegalArgumentException("maxActivePaths must be > 0");
+            }
+            if (hotwordsScore < 0) {
+                throw new IllegalArgumentException("hotwordsScore must be >= 0");
+            }
+            if (blankPenalty < 0) {
+                throw new IllegalArgumentException("blankPenalty must be >= 0");
+            }
+            return new OnlineRecognizerConfig(this);
         }
📝 Committable suggestion

‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.

Suggested change
public static class Builder {
    private FeatureConfig featConfig = FeatureConfig.builder().build();
    private OnlineModelConfig modelConfig = OnlineModelConfig.builder().build();
    private OnlineLMConfig lmConfig = OnlineLMConfig.builder().build();
    private OnlineCtcFstDecoderConfig ctcFstDecoderConfig = OnlineCtcFstDecoderConfig.builder().build();
    private EndpointConfig endpointConfig = EndpointConfig.builder().build();
    private HomophoneReplacerConfig hr = HomophoneReplacerConfig.builder().build();
    private boolean enableEndpoint = true;
    private String decodingMethod = "greedy_search";
    private int maxActivePaths = 4;
    private String hotwordsFile = "";
    private float hotwordsScore = 1.5f;
    private String ruleFsts = "";
    private String ruleFars = "";
    private float blankPenalty = 0.0f;

    public OnlineRecognizerConfig build() {
        Objects.requireNonNull(decodingMethod, "decodingMethod");
        if (maxActivePaths <= 0) {
            throw new IllegalArgumentException("maxActivePaths must be > 0");
        }
        if (hotwordsScore < 0) {
            throw new IllegalArgumentException("hotwordsScore must be >= 0");
        }
        if (blankPenalty < 0) {
            throw new IllegalArgumentException("blankPenalty must be >= 0");
        }
        return new OnlineRecognizerConfig(this);
    }
🤖 Prompt for AI Agents
In
sherpa-onnx/java-api/src/main/java/com/k2fsa/sherpa/onnx/OnlineRecognizerConfig.java
between lines 48 and 63, the Builder class lacks validation for its fields
before building an instance. Add validation checks in the build() method or
before returning the constructed object to ensure decodingMethod is not null or
blank, maxActivePaths is greater than 0, hotwordsScore is at least 1.0, and
blankPenalty is non-negative. If any validation fails, throw an
IllegalArgumentException with a clear message to prevent invalid configurations.

Comment on lines 34 to 41
public void release() {
    // stream object must be release after used
    if (this.ptr == 0) {
        return;
    }
    delete(this.ptr);
    this.ptr = 0;
}

🛠️ Refactor suggestion

release() is not thread-safe – possible double free of native resource

release() and finalize() can be executed concurrently from different threads.
Since ptr is a plain long, the following race is possible:

  1. Thread-A calls release(), passes the if (ptr == 0) guard, still sees ptr != 0.
  2. The GC finalizer thread calls finalize() almost simultaneously, also passes the guard.
  3. Both threads call the native delete(ptr) – double free / corruption.

Guard the critical section with synchronization or an AtomicLong, and/or make release() idempotent using compareAndSet.
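A small sketch of the atomic variant suggested above; method names follow the existing class, while the AtomicLong field itself is the proposed change:

import java.util.concurrent.atomic.AtomicLong;

public class OnlineStream {
    private final AtomicLong ptr = new AtomicLong(0);

    public void release() {
        // getAndSet(0) has the same effect as a compareAndSet loop here: only the
        // thread that swaps out a non-zero value performs the native delete, so a
        // concurrent release()/finalize() pair cannot double-free the pointer.
        long p = ptr.getAndSet(0);
        if (p != 0) {
            delete(p);
        }
    }

    private native void delete(long ptr);
}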

🤖 Prompt for AI Agents
In sherpa-onnx/java-api/src/main/java/com/k2fsa/sherpa/onnx/OnlineStream.java
around lines 34 to 41, the release() method is not thread-safe and can cause a
double free of the native resource if called concurrently with finalize(). To
fix this, synchronize the critical section in release() or use an AtomicLong for
ptr and implement a compareAndSet operation to ensure only one thread can delete
the native resource and set ptr to 0, making release() idempotent and
thread-safe.

Comment on lines 43 to 47
@Override
protected void finalize() throws Throwable {
    release();
    super.finalize();
}

🛠️ Refactor suggestion

finalize() is deprecated – migrate to AutoCloseable

finalize() has been deprecated since Java 9 and removed in recent JDKs. Relying on it delays resource release, hurts GC performance, and will break on newer runtimes.

Implement AutoCloseable and encourage try-with-resources instead:

-public class OnlineStream {
+public class OnlineStream implements AutoCloseable {
 ...
-    public void release() {
+    @Override
+    public void close() {
         if (this.ptr == 0) {
             return;
         }
         delete(this.ptr);
         this.ptr = 0;
     }
 
-    @Override
-    protected void finalize() throws Throwable {
-        release();
-        super.finalize();
-    }
+    /* `finalize` removed – rely on explicit `close()` */

Clients can then use:

try (OnlineStream stream = recognizer.createStream()) {
    ...
}
🤖 Prompt for AI Agents
In sherpa-onnx/java-api/src/main/java/com/k2fsa/sherpa/onnx/OnlineStream.java
around lines 43 to 47, the finalize() method is deprecated and should be
removed. Instead, make the OnlineStream class implement AutoCloseable, rename
the release() method to close(), and ensure close() properly releases resources.
This allows clients to use try-with-resources for automatic and timely resource
management.

@litongjava
Contributor Author

The warnings above already existed in the previous code.

@csukuangfj
Collaborator

Thanks for the contribution!

With the current implementation in this PR, users still have to do a lot of work. Please find a way to bundle the libraries for the different platforms inside the package and have the code load the right library automatically, so it is convenient for users.

Users should not have to manually download the platform-specific libraries and set environment variables. Otherwise, the current approach is already good enough and there is no need for Maven or any other package manager.

@csukuangfj
Collaborator

You can refer to how onnxruntime does it:

https://onnxruntime.ai/docs/get-started/with-java.html

@litongjava
Contributor Author

litongjava commented Aug 2, 2025

1. No problem, I'll do it.
2. I also fixed the code along the way.
3. Supported platforms:
   Windows (x64, x86, arm64)
   Linux (x64, arm64)
   macOS (x64, arm64)
4. Please be aware that the packaged artifact will become larger.

@coderabbitai coderabbitai bot left a comment

Actionable comments posted: 1

♻️ Duplicate comments (1)
sherpa-onnx/java-api/src/main/java/com/k2fsa/sherpa/onnx/FeatureConfig.java (1)

38-54: Add input validation to prevent invalid configurations

The builder methods don't validate input values, allowing negative or zero values that could cause downstream failures. Consider adding validation either in the setters or in the build() method.

Add validation in the build() method:

     public FeatureConfig build() {
+        if (sampleRate <= 0) {
+            throw new IllegalArgumentException("sampleRate must be positive, got: " + sampleRate);
+        }
+        if (featureDim <= 0) {
+            throw new IllegalArgumentException("featureDim must be positive, got: " + featureDim);
+        }
+        if (dither < 0f) {
+            throw new IllegalArgumentException("dither must be non-negative, got: " + dither);
+        }
-        return new FeatureConfig(new Builder());
+        return new FeatureConfig(this);
     }
📜 Review details

Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between becf010 and 8158183.

📒 Files selected for processing (1)
  • sherpa-onnx/java-api/src/main/java/com/k2fsa/sherpa/onnx/FeatureConfig.java (1 hunks)

@coderabbitai coderabbitai bot left a comment

Actionable comments posted: 1

🧹 Nitpick comments (1)
sherpa-onnx/java-api/src/main/java/com/k2fsa/sherpa/onnx/utils/LibraryUtils.java (1)

36-42: Improve architecture detection robustness.

The current architecture detection using contains("64") is too broad and may incorrectly classify some architectures. Consider using more specific patterns.

-      if (osArch.contains("aarch64") || osArch.contains("arm")) {
+      if (osArch.contains("aarch64") || osArch.contains("arm64") || osArch.contains("arm")) {
         archName = WIN_ARM64;
-      } else if (osArch.contains("64")) {
+      } else if (osArch.contains("x86_64") || osArch.contains("amd64")) {
         archName = WIN_X64;
       } else {
         archName = WIN_X86;
       }

Apply similar improvements to the Linux and macOS detection blocks.

📜 Review details

Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between 16b6dee and e193d40.

📒 Files selected for processing (11)
  • sherpa-onnx/java-api/pom.xml (1 hunks)
  • sherpa-onnx/java-api/src/main/java/com/k2fsa/sherpa/onnx/LibraryLoader.java (1 hunks)
  • sherpa-onnx/java-api/src/main/java/com/k2fsa/sherpa/onnx/OnlineParaformerModelConfig.java (1 hunks)
  • sherpa-onnx/java-api/src/main/java/com/k2fsa/sherpa/onnx/OnlineRecognizer.java (1 hunks)
  • sherpa-onnx/java-api/src/main/java/com/k2fsa/sherpa/onnx/OnlineRecognizerConfig.java (1 hunks)
  • sherpa-onnx/java-api/src/main/java/com/k2fsa/sherpa/onnx/OnlineStream.java (1 hunks)
  • sherpa-onnx/java-api/src/main/java/com/k2fsa/sherpa/onnx/OnlineTransducerModelConfig.java (1 hunks)
  • sherpa-onnx/java-api/src/main/java/com/k2fsa/sherpa/onnx/core/Core.java (1 hunks)
  • sherpa-onnx/java-api/src/main/java/com/k2fsa/sherpa/onnx/utils/LibraryUtils.java (1 hunks)
  • sherpa-onnx/java-api/src/main/resources/.gitignore (1 hunks)
  • sherpa-onnx/java-api/src/main/resources/readme.md (1 hunks)
✅ Files skipped from review due to trivial changes (4)
  • sherpa-onnx/java-api/src/main/resources/.gitignore
  • sherpa-onnx/java-api/src/main/resources/readme.md
  • sherpa-onnx/java-api/src/main/java/com/k2fsa/sherpa/onnx/core/Core.java
  • sherpa-onnx/java-api/pom.xml
🚧 Files skipped from review as they are similar to previous changes (5)
  • sherpa-onnx/java-api/src/main/java/com/k2fsa/sherpa/onnx/OnlineParaformerModelConfig.java
  • sherpa-onnx/java-api/src/main/java/com/k2fsa/sherpa/onnx/OnlineRecognizerConfig.java
  • sherpa-onnx/java-api/src/main/java/com/k2fsa/sherpa/onnx/OnlineTransducerModelConfig.java
  • sherpa-onnx/java-api/src/main/java/com/k2fsa/sherpa/onnx/OnlineRecognizer.java
  • sherpa-onnx/java-api/src/main/java/com/k2fsa/sherpa/onnx/OnlineStream.java
🧰 Additional context used
🧬 Code Graph Analysis (1)
sherpa-onnx/java-api/src/main/java/com/k2fsa/sherpa/onnx/LibraryLoader.java (1)
sherpa-onnx/java-api/src/main/java/com/k2fsa/sherpa/onnx/utils/LibraryUtils.java (1)
  • LibraryUtils (11-100)
🔇 Additional comments (5)
sherpa-onnx/java-api/src/main/java/com/k2fsa/sherpa/onnx/LibraryLoader.java (2)

3-3: LGTM!

The import is correctly added and properly formatted.


11-12: LGTM! Clean delegation to platform-aware library loading.

The refactoring delegates native library loading to LibraryUtils, which provides automatic platform detection and resource extraction. This aligns well with the PR objective of simplifying user experience by eliminating manual platform-specific library management.

sherpa-onnx/java-api/src/main/java/com/k2fsa/sherpa/onnx/utils/LibraryUtils.java (3)

12-22: LGTM! Comprehensive platform support.

The platform constants cover all major OS and architecture combinations mentioned in the PR discussion (Windows x64/x86/arm64, Linux x64/arm64/x86, macOS x64/arm64). The naming convention is consistent and clear.


66-79: LGTM! Solid library extraction and loading implementation.

The directory creation, resource extraction, and library loading logic is well-implemented with proper error handling. The approach of extracting to user home directory follows common patterns used by other Java libraries (like ONNX Runtime mentioned in PR comments).


89-99: LGTM! Excellent resource management and error handling.

The extractResource method properly uses try-with-resources for automatic stream closing, handles missing resources gracefully, and provides clear error messages. The use of StandardCopyOption.REPLACE_EXISTING ensures updates work correctly.

Comment on lines 25 to 28
String osName = System.getProperty("os.name").toLowerCase();
String osArch = System.getProperty("os.arch").toLowerCase();
String userHome = System.getProperty("user.home");
System.out.printf("Detected OS=%s, ARCH=%s, HOME=%s%n", osName, osArch, userHome);
@coderabbitai coderabbitai bot Aug 6, 2025

🛠️ Refactor suggestion

Add null checks for system properties.

System properties could potentially be null in some environments, which would cause NullPointerException when calling .toLowerCase().

-    String osName = System.getProperty("os.name").toLowerCase();
-    String osArch = System.getProperty("os.arch").toLowerCase();
-    String userHome = System.getProperty("user.home");
+    String osName = System.getProperty("os.name");
+    String osArch = System.getProperty("os.arch"); 
+    String userHome = System.getProperty("user.home");
+    
+    if (osName == null || osArch == null || userHome == null) {
+      throw new RuntimeException("Unable to detect system properties: os.name=" + osName + ", os.arch=" + osArch + ", user.home=" + userHome);
+    }
+    
+    osName = osName.toLowerCase();
+    osArch = osArch.toLowerCase();
📝 Committable suggestion

‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.

Suggested change
String osName = System.getProperty("os.name").toLowerCase();
String osArch = System.getProperty("os.arch").toLowerCase();
String userHome = System.getProperty("user.home");
System.out.printf("Detected OS=%s, ARCH=%s, HOME=%s%n", osName, osArch, userHome);
String osName = System.getProperty("os.name");
String osArch = System.getProperty("os.arch");
String userHome = System.getProperty("user.home");
if (osName == null || osArch == null || userHome == null) {
throw new RuntimeException("Unable to detect system properties: os.name=" + osName
+ ", os.arch=" + osArch + ", user.home=" + userHome);
}
osName = osName.toLowerCase();
osArch = osArch.toLowerCase();
System.out.printf("Detected OS=%s, ARCH=%s, HOME=%s%n", osName, osArch, userHome);
🤖 Prompt for AI Agents
In
sherpa-onnx/java-api/src/main/java/com/k2fsa/sherpa/onnx/utils/LibraryUtils.java
around lines 25 to 28, add null checks for the system properties "os.name",
"os.arch", and "user.home" before calling toLowerCase() or using them. Assign
default values or handle null cases to prevent NullPointerException when these
properties are missing.

Collaborator

Can you fix this comment?


@csukuangfj I'll implement the null check fix for the system properties to prevent potential NullPointerException:

String osName = System.getProperty("os.name");
String osArch = System.getProperty("os.arch");
String userHome = System.getProperty("user.home");

if (osName == null || osArch == null || userHome == null) {
  throw new RuntimeException("Unable to detect system properties: os.name=" + osName
      + ", os.arch=" + osArch + ", user.home=" + userHome);
}

osName = osName.toLowerCase();
osArch = osArch.toLowerCase();
System.out.printf("Detected OS=%s, ARCH=%s, HOME=%s%n", osName, osArch, userHome);

This change adds defensive programming by:

  1. Retrieving system properties first without calling toLowerCase()
  2. Checking if any of the critical properties are null
  3. Throwing a descriptive RuntimeException if any are missing
  4. Only calling toLowerCase() after confirming the properties are not null

This prevents potential NullPointerException in environments where these system properties might not be available.

Contributor Author

This is an incorrect comment; the three variables above cannot be empty.
String osName = System.getProperty("os.name");
String osArch = System.getProperty("os.arch");
String userHome = System.getProperty("user.home");


Seems like the humans are having a chat. I'll hop back into my burrow for now. If you need me again, just tag @coderabbitai in a new comment, and I'll come hopping out!

package com.k2fsa.sherpa.onnx.core;

public interface Core {
String WIN_NATIVE_LIBRARY_NAME = "libsherpa-onnx-jni.so";
Collaborator

Is it sherpa-onnx-jni.dll for Windows?

Contributor Author

fixed

Comment on lines 25 to 28
String osName = System.getProperty("os.name").toLowerCase();
String osArch = System.getProperty("os.arch").toLowerCase();
String userHome = System.getProperty("user.home");
System.out.printf("Detected OS=%s, ARCH=%s, HOME=%s%n", osName, osArch, userHome);
Collaborator

Can you fix this comment?

String osName = System.getProperty("os.name").toLowerCase();
String osArch = System.getProperty("os.arch").toLowerCase();
String userHome = System.getProperty("user.home");
System.out.printf("Detected OS=%s, ARCH=%s, HOME=%s%n", osName, osArch, userHome);
Collaborator

Suggested change
System.out.printf("Detected OS=%s, ARCH=%s, HOME=%s%n", osName, osArch, userHome);
System.out.printf("Detected OS=%s, ARCH=%s, HOME=%s\n", osName, osArch, userHome);

Contributor Author

No I can't
In a printf‐style format string, %n and the literal newline \n are not exactly the same:

  1. %n is a format specifier

    • It’s handled by Java’s Formatter and is mapped to the platform’s line‐separator (System.lineSeparator()).

      • On Windows, that’s "\r\n"
      • On Unix/Linux/macOS, that’s "\n"
    • Usage example:

      System.out.printf("First line%nSecond line%n");
      // → on Windows: prints "First line\r\nSecond line\r\n"
  2. \n is just a literal line‐feed character

    • It’s escaped by the Java compiler into a single '\n' in the format string.

    • It does not get translated to "\r\n" on Windows (so you end up with only \n).

    • Usage example:

      System.out.printf("First line\nSecond line\n");
      // → on Windows: prints "First line\nSecond line\n" (no carriage return)

Why prefer %n in printf‐style output?

  • Portability: You don’t have to worry about which OS you’re on—%n will always use the correct separator.
  • Semantics: It’s a format-specifier designed exactly for “new-line here” in printf‐style calls.
  • Readability: It makes it clear that this newline is part of your formatted output, not just a Java‐string escape.

Quick summary

Feature           | %n                  | \n
Type              | Formatter specifier | Java string literal escape
Result on Windows | "\r\n"              | "\n"
Result on Unix    | "\n"                | "\n"
Portability       | ✅ always correct    | ❌ may be wrong on Windows

Bottom line: in System.out.printf(...), use %n whenever you need a newline in cross-platform code.

} else {
archName = WIN_X86;
}

Collaborator

Please remove extra blank lines.

Can you use IDE tools to reformat the file?

Contributor Author

I like two-space indentation; how about you?

Collaborator

[Screenshot 2025-08-07 at 10:26:47]

I suggest using an automated tool to format the code you wrote yourself.

For the files that were only moved above, please just move them; do not reformat them, as that makes the review harder.

@csukuangfj
Collaborator

By the way, I hope you can follow the approach used in onnxruntime.
See also
https://github.com/microsoft/onnxruntime/blob/main/java/src/main/java/ai/onnxruntime/OnnxRuntime.java

1

For instance, there is no need to use

public interface Core {
String WIN_NATIVE_LIBRARY_NAME = "libsherpa-onnx-jni.so";
String UNIX_NATIVE_LIBRARY_NAME = "libsherpa-onnx-jni.so";
String MACOS_NATIVE_LIBRARY_NAME = "libsherpa-onnx-jni.dylib";
}

You can use the function System.mapLibraryName("sherpa-onnx-jni") to do that.

See https://github.com/microsoft/onnxruntime/blob/b1546da6f524f4995ed3273648ebf1152f9f727a/java/src/main/java/ai/onnxruntime/OnnxRuntime.java#L458
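For reference, System.mapLibraryName derives the platform-specific file name from a single base name, which removes the need for per-OS constants:

// "sherpa-onnx-jni" maps to:
//   sherpa-onnx-jni.dll       on Windows
//   libsherpa-onnx-jni.so     on Linux
//   libsherpa-onnx-jni.dylib  on macOS (JDK 7+)
String libName = System.mapLibraryName("sherpa-onnx-jni");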

2

Please replace part of the code from


with the code from
https://github.com/microsoft/onnxruntime/blob/b1546da6f524f4995ed3273648ebf1152f9f727a/java/src/main/java/ai/onnxruntime/OnnxRuntime.java#L118

3

Please follow

You can define

  • sherpa_onnx.native.path: if the directory specified by it exists, load the sherpa-onnx-jni lib from it
  • otherwise, load sherpa-onnx-jni from the jar (see the sketch after this list)
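A minimal sketch of that loading order; the property name follows the suggestion above, the resource path follows the onnxruntime-style layout shown later in this thread, and the class name and detectOsArch() helper are illustrative:

import java.io.InputStream;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.nio.file.StandardCopyOption;

final class NativeLoaderSketch {
    static void loadNativeLibrary() throws java.io.IOException {
        String libName = System.mapLibraryName("sherpa-onnx-jni");
        String dir = System.getProperty("sherpa_onnx.native.path");
        if (dir != null && Files.isDirectory(Paths.get(dir))) {
            // 1) The directory given via -Dsherpa_onnx.native.path exists: load from it.
            System.load(Paths.get(dir, libName).toAbsolutePath().toString());
            return;
        }
        // 2) Otherwise fall back to the copy bundled inside the jar.
        String resource = "/com/k2fsa/sherpa/onnx/native/" + detectOsArch() + "/" + libName;
        try (InputStream in = NativeLoaderSketch.class.getResourceAsStream(resource)) {
            if (in == null) {
                throw new IllegalStateException("No bundled native library at " + resource);
            }
            Path tmp = Files.createTempFile("sherpa-onnx-jni-", libName);
            Files.copy(in, tmp, StandardCopyOption.REPLACE_EXISTING);
            System.load(tmp.toAbsolutePath().toString());
        }
    }

    private static String detectOsArch() {
        // Simplified detection; win-x86 and other variants omitted for brevity.
        String os = System.getProperty("os.name").toLowerCase();
        String arch = System.getProperty("os.arch").toLowerCase();
        String osPart = os.contains("win") ? "win" : os.contains("mac") ? "osx" : "linux";
        String archPart = (arch.contains("aarch64") || arch.contains("arm")) ? "aarch64" : "x64";
        return osPart + "-" + archPart;
    }
}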

4

Please reuse as much code as possible from onnxruntime. It is well tested. You don't need to reinvent the wheel.

@csukuangfj
Collaborator

Below is part of the directory structure obtained by unpacking onnxruntime-1.22.0.jar.

Archive:  onnxruntime-1.22.0.jar
   creating: ai/
   creating: ai/onnxruntime/
  inflating: ai/onnxruntime/MapInfo.class  
   creating: ai/onnxruntime/native/
   creating: ai/onnxruntime/native/linux-aarch64/
  inflating: ai/onnxruntime/native/linux-aarch64/libonnxruntime.so  
  inflating: ai/onnxruntime/native/linux-aarch64/libonnxruntime4j_jni.so  
   creating: ai/onnxruntime/native/linux-x64/
  inflating: ai/onnxruntime/native/linux-x64/libonnxruntime.so  
  inflating: ai/onnxruntime/native/linux-x64/libonnxruntime4j_jni.so  
   creating: ai/onnxruntime/native/osx-aarch64/
  inflating: ai/onnxruntime/native/osx-aarch64/libonnxruntime.dylib  
   creating: ai/onnxruntime/native/osx-aarch64/libonnxruntime.dylib.dSYM/
   creating: ai/onnxruntime/native/osx-aarch64/libonnxruntime.dylib.dSYM/Contents/
  inflating: ai/onnxruntime/native/osx-aarch64/libonnxruntime.dylib.dSYM/Contents/Info.plist  
   creating: ai/onnxruntime/native/osx-aarch64/libonnxruntime.dylib.dSYM/Contents/Resources/
   creating: ai/onnxruntime/native/osx-aarch64/libonnxruntime.dylib.dSYM/Contents/Resources/DWARF/
  inflating: ai/onnxruntime/native/osx-aarch64/libonnxruntime.dylib.dSYM/Contents/Resources/DWARF/libonnxruntime.dylib  
   creating: ai/onnxruntime/native/osx-aarch64/libonnxruntime.dylib.dSYM/Contents/Resources/Relocations/
   creating: ai/onnxruntime/native/osx-aarch64/libonnxruntime.dylib.dSYM/Contents/Resources/Relocations/aarch64/
  inflating: ai/onnxruntime/native/osx-aarch64/libonnxruntime.dylib.dSYM/Contents/Resources/Relocations/aarch64/libonnxruntime.dylib.yml  
  inflating: ai/onnxruntime/native/osx-aarch64/libonnxruntime4j_jni.dylib  
   creating: ai/onnxruntime/native/osx-aarch64/libonnxruntime4j_jni.dylib.dSYM/
   creating: ai/onnxruntime/native/osx-aarch64/libonnxruntime4j_jni.dylib.dSYM/Contents/
  inflating: ai/onnxruntime/native/osx-aarch64/libonnxruntime4j_jni.dylib.dSYM/Contents/Info.plist  
   creating: ai/onnxruntime/native/osx-aarch64/libonnxruntime4j_jni.dylib.dSYM/Contents/Resources/
   creating: ai/onnxruntime/native/osx-aarch64/libonnxruntime4j_jni.dylib.dSYM/Contents/Resources/DWARF/
  inflating: ai/onnxruntime/native/osx-aarch64/libonnxruntime4j_jni.dylib.dSYM/Contents/Resources/DWARF/libonnxruntime4j_jni.dylib  
   creating: ai/onnxruntime/native/osx-aarch64/libonnxruntime4j_jni.dylib.dSYM/Contents/Resources/Relocations/
   creating: ai/onnxruntime/native/osx-aarch64/libonnxruntime4j_jni.dylib.dSYM/Contents/Resources/Relocations/aarch64/
  inflating: ai/onnxruntime/native/osx-aarch64/libonnxruntime4j_jni.dylib.dSYM/Contents/Resources/Relocations/aarch64/libonnxruntime4j_jni.dylib.yml  
   creating: ai/onnxruntime/native/osx-x64/
  inflating: ai/onnxruntime/native/osx-x64/libonnxruntime.dylib  
   creating: ai/onnxruntime/native/osx-x64/libonnxruntime.dylib.dSYM/
   creating: ai/onnxruntime/native/osx-x64/libonnxruntime.dylib.dSYM/Contents/
  inflating: ai/onnxruntime/native/osx-x64/libonnxruntime.dylib.dSYM/Contents/Info.plist  
   creating: ai/onnxruntime/native/osx-x64/libonnxruntime.dylib.dSYM/Contents/Resources/
   creating: ai/onnxruntime/native/osx-x64/libonnxruntime.dylib.dSYM/Contents/Resources/DWARF/
  inflating: ai/onnxruntime/native/osx-x64/libonnxruntime.dylib.dSYM/Contents/Resources/DWARF/libonnxruntime.dylib  
   creating: ai/onnxruntime/native/osx-x64/libonnxruntime.dylib.dSYM/Contents/Resources/Relocations/
   creating: ai/onnxruntime/native/osx-x64/libonnxruntime.dylib.dSYM/Contents/Resources/Relocations/x86_64/
  inflating: ai/onnxruntime/native/osx-x64/libonnxruntime.dylib.dSYM/Contents/Resources/Relocations/x86_64/libonnxruntime.dylib.yml  
  inflating: ai/onnxruntime/native/osx-x64/libonnxruntime4j_jni.dylib  
   creating: ai/onnxruntime/native/osx-x64/libonnxruntime4j_jni.dylib.dSYM/
   creating: ai/onnxruntime/native/osx-x64/libonnxruntime4j_jni.dylib.dSYM/Contents/
  inflating: ai/onnxruntime/native/osx-x64/libonnxruntime4j_jni.dylib.dSYM/Contents/Info.plist  
   creating: ai/onnxruntime/native/osx-x64/libonnxruntime4j_jni.dylib.dSYM/Contents/Resources/
   creating: ai/onnxruntime/native/osx-x64/libonnxruntime4j_jni.dylib.dSYM/Contents/Resources/DWARF/
  inflating: ai/onnxruntime/native/osx-x64/libonnxruntime4j_jni.dylib.dSYM/Contents/Resources/DWARF/libonnxruntime4j_jni.dylib  
   creating: ai/onnxruntime/native/osx-x64/libonnxruntime4j_jni.dylib.dSYM/Contents/Resources/Relocations/
   creating: ai/onnxruntime/native/osx-x64/libonnxruntime4j_jni.dylib.dSYM/Contents/Resources/Relocations/x86_64/
  inflating: ai/onnxruntime/native/osx-x64/libonnxruntime4j_jni.dylib.dSYM/Contents/Resources/Relocations/x86_64/libonnxruntime4j_jni.dylib.yml  
   creating: ai/onnxruntime/native/win-x64/
  inflating: ai/onnxruntime/native/win-x64/onnxruntime.dll  
  inflating: ai/onnxruntime/native/win-x64/onnxruntime.pdb  
  inflating: ai/onnxruntime/native/win-x64/onnxruntime4j_jni.dll  
  inflating: ai/onnxruntime/native/win-x64/onnxruntime4j_jni.pdb  
  inflating: ai/onnxruntime/NodeInfo.class  

We can have

  • com/k2fsa/sherpa/onnx/native/linux-aarch64/libsherpa-onnx-jni.so
  • com/k2fsa/sherpa/onnx/native/linux-x64/libsherpa-onnx-jni.so
  • com/k2fsa/sherpa/onnx/native/osx-aarch64/libsherpa-onnx-jni.dylib
  • com/k2fsa/sherpa/onnx/native/osx-x64/libsherpa-onnx-jni.dylib
  • com/k2fsa/sherpa/onnx/native/win-x64/sherpa-onnx-jni.dll
  • com/k2fsa/sherpa/onnx/native/win-aarch64/sherpa-onnx-jni.dll
  • com/k2fsa/sherpa/onnx/native/win-x86/sherpa-onnx-jni.dll

(There is no need to consider linux-x86, since there is no corresponding onnxruntime lib; linux-arm32 does not need to be considered either.)

public class LibraryLoader {
  private static volatile boolean autoLoadEnabled = true;
  private static volatile boolean isLoaded = false;

  static synchronized void loadLibrary() {
    if (!isLoaded) {
      System.loadLibrary("sherpa-onnx-jni");
      //System.loadLibrary("sherpa-onnx-jni");
Collaborator

Please remove code not used.

Contributor Author

removed

Contributor Author

This is required. The loading method is different: I'm using System.load instead of the previous System.loadLibrary.
System.load requires specifying the full file path, including the file extension.

When the program starts, it copies the files from the jar to the model directory under the user's home directory and then loads them by calling System.load.
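A rough sketch of that copy-then-load flow (the target directory name and class are assumptions for illustration, not the PR's actual code):

```java
import java.io.InputStream;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.nio.file.StandardCopyOption;

// Sketch: copy the bundled library into a per-user directory once, then load it by absolute path.
final class HomeDirLoadSketch {

  static void load(String resourcePath) throws Exception {
    String libName = System.mapLibraryName("sherpa-onnx-jni");
    // ".sherpa_onnx" is an assumed directory name, not necessarily what the PR uses.
    Path target = Paths.get(System.getProperty("user.home"), ".sherpa_onnx", libName);

    if (!Files.exists(target)) {
      Files.createDirectories(target.getParent());
      try (InputStream in = HomeDirLoadSketch.class.getResourceAsStream(resourcePath)) {
        if (in == null) {
          throw new IllegalStateException("Missing resource: " + resourcePath);
        }
        Files.copy(in, target, StandardCopyOption.REPLACE_EXISTING);
      }
    }

    // System.load takes the full path, extension included.
    System.load(target.toAbsolutePath().toString());
  }
}
```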

private final String decoder;

private OnlineParaformerModelConfig(Builder builder) {
this.encoder = java.util.Objects.requireNonNull(builder.encoder, "encoder cannot be null");
Collaborator

Is there a reason to check whether it is null?

By default, encoder and decoder are initialized to an empty string and they cannot be null.

Contributor Author

Just making sure it's not null.

Collaborator

Can you give an example in which case it is null?

Contributor Author

removed

Comment on lines 14 to 17
if (this.encoder.isEmpty() || this.decoder.isEmpty()) {
throw new IllegalArgumentException("encoder/decoder path must not be empty");
}
}
Collaborator

Suggested change
if (this.encoder.isEmpty() || this.decoder.isEmpty()) {
throw new IllegalArgumentException("encoder/decoder path must not be empty");
}
}

Please remove this check.

sherpa-onnx supports lots of models. At any time, only one model can be configured and all other models have their default values, e.g., model files are initialized to empty strings.

Contributor Author

got it

private float blankPenalty = 0.0f;

public OnlineRecognizerConfig build() {
Objects.requireNonNull(decodingMethod, "decodingMethod");
Collaborator

Please remove checks and keep the original code.

The C++ code will validate the parameters.

Contributor Author

Ok, removed Objects.requireNonNull(decodingMethod, "decodingMethod");

private String joiner = "";

public OnlineTransducerModelConfig build() {
if (encoder.isEmpty() || decoder.isEmpty() || joiner.isEmpty()) {
Collaborator

Please remove the checks and keep the original code.

Contributor Author

removed

@coderabbitai coderabbitai bot left a comment

Actionable comments posted: 2

🧹 Nitpick comments (10)
sherpa-onnx/java-api/src/main/resources/readme.md (4)

1-2: Fix typos and improve clarity of the first sentence
downlaod and donwload are misspelled. The sentence is also missing an article and a clearer subject.

-please downlaod file and put in folder
-[donwload link](https://huggingface.co/csukuangfj/sherpa-onnx-libs/tree/main/jni)
+Please download the required archive files and place them in the appropriate folder.
+[Download link](https://huggingface.co/csukuangfj/sherpa-onnx-libs/tree/main/jni)

4-9: Add markdown code fencing for the archive list
Rendering the list inside a fenced block avoids accidental formatting issues and improves readability (especially for long version strings).

-- sherpa-onnx-v1.12.7-linux-aarch64-jni.tar.bz2
...
-- sherpa-onnx-v1.12.7-win-x64-jni.tar.bz2
+```text
+sherpa-onnx-v1.12.7-linux-aarch64-jni.tar.bz2
+sherpa-onnx-v1.12.7-linux-x64-jni.tar.bz2
+sherpa-onnx-v1.12.7-osx-arm64-jni.tar.bz2
+sherpa-onnx-v1.12.7-osx-x86_64-jni.tar.bz2
+sherpa-onnx-v1.12.7-win-x64-jni.tar.bz2
+```

11-16: Prefix folder names with the parent libs/ directory
The later tree shows these folders under libs/, but the bullet list omits that context. Align the two to avoid user confusion.

-- linux_arm64
-- linux_x64
-- darwin_arm64
-- darwin_x64
-- windows_x64
+libs/
+  linux_arm64
+  linux_x64
+  darwin_arm64
+  darwin_x64
+  windows_x64

18-34: Directory tree is indented with a leading dot that renders oddly
Consider wrapping the tree inside a fenced block and adding a title line for clarity.

-```
-.
-├── .gitignore
+```text
+<project-root>
+├── .gitignore
sherpa-onnx/java-api/readme.md (5)

3-4: Replace italic heading with a proper second-level heading
Markdown-lint flags the use of emphasis instead of a heading. Converting it to ## keeps semantics while removing the MD036 violation.

-*Applicable to Windows / macOS / Linux (using Windows as an example for dynamic library loading)*
+## Applicable Platforms
+Windows / macOS / Linux (examples below use Windows for dynamic-library loading)

41-46: Pinning to a specific ORT version without explanation
Locking to v1.17.1 is fine, but explain why (compatibility, tested version, etc.) or direct users to check for newer versions. This prevents outdated docs in the future.


93-101: Specify language for fenced error block
Add text after the triple back-ticks to silence MD040 and improve highlighting.

-```text
+```text

123-136: Missing language specifiers for shell snippets
Several fenced blocks lack a language tag. Adding bash enables syntax highlighting and resolves MD040 warnings.

Example:

-```bash
+# Correct – already tagged

Repeat for all subsequent shell blocks that miss a tag.


211-229: Add a language specifier for the output block
Mark the block as text to satisfy MD040.

-```
+```text
-- elapsed           : 6.739 seconds
...
-- Saved to          : tts-kokoro-en.wav
sherpa-onnx/java-api/src/main/java/com/k2fsa/sherpa/onnx/core/Core.java (1)

3-3: Consider using a final class instead of an interface for constants.

While using an interface for constants is syntactically valid, it's generally considered an anti-pattern. A final utility class with a private constructor would be more appropriate for housing static constants.

-public interface Core {
+public final class Core {
+  private Core() {
+    // Utility class
+  }
📜 Review details

Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between e193d40 and a6dbf0e.

📒 Files selected for processing (5)
  • sherpa-onnx/java-api/readme.md (1 hunks)
  • sherpa-onnx/java-api/readme.zh.md (1 hunks)
  • sherpa-onnx/java-api/src/main/java/com/k2fsa/sherpa/onnx/core/Core.java (1 hunks)
  • sherpa-onnx/java-api/src/main/resources/.gitignore (1 hunks)
  • sherpa-onnx/java-api/src/main/resources/readme.md (1 hunks)
🚧 Files skipped from review as they are similar to previous changes (2)
  • sherpa-onnx/java-api/src/main/resources/.gitignore
  • sherpa-onnx/java-api/readme.zh.md
🧰 Additional context used
🪛 markdownlint-cli2 (0.17.2)
sherpa-onnx/java-api/readme.md

3-3: Emphasis used instead of a heading

(MD036, no-emphasis-as-heading)


113-113: Fenced code blocks should have a language specified

(MD040, fenced-code-language)


131-131: Fenced code blocks should have a language specified

(MD040, fenced-code-language)


211-211: Fenced code blocks should have a language specified

(MD040, fenced-code-language)

sherpa-onnx/java-api/src/main/resources/readme.md

3-3: Emphasis used instead of a heading

(MD036, no-emphasis-as-heading)

🔇 Additional comments (2)
sherpa-onnx/java-api/src/main/java/com/k2fsa/sherpa/onnx/core/Core.java (2)

1-1: LGTM!

Package declaration follows standard Java naming conventions.


7-7: LGTM!

Interface closing is correct.

Comment on lines 35 to 36
Windows 10 includes ONNX Runtime by default—no extra installation is required.

@coderabbitai coderabbitai bot Aug 6, 2025

🛠️ Refactor suggestion

“Windows 10 includes ONNX Runtime by default” is inaccurate
ONNX Runtime is not bundled with Windows 10. Users will still need the shared library unless your JAR embeds it. Please revise to avoid misleading instructions.

🤖 Prompt for AI Agents
In sherpa-onnx/java-api/readme.md around lines 35 to 36, the statement claiming
that Windows 10 includes ONNX Runtime by default is incorrect. Update the text
to clarify that ONNX Runtime is not bundled with Windows 10 and users must
install the shared library separately unless it is embedded in the JAR. Remove
or revise the misleading statement to accurately reflect the installation
requirements.

Contributor Author

you are wrong.

Starting from Windows 10 v1809 and all versions of Windows 11, the system comes with built-in ONNX Runtime as part of Windows ML (WinRT API), exposed through Windows.AI.MachineLearning.dll. You can directly use WinML to load and run ONNX models without additional downloads or installations.
https://learn.microsoft.com/en-us/windows/ai/new-windows-ml/run-onnx-models


🧩 Analysis chain

🌐 Web query:

Windows 10 v1809 Windows 11 built-in ONNX Runtime Windows ML WinML native libraries

💡 Result:

Windows 10 version 1809 and Windows 11 both include built-in support for ONNX Runtime via Windows ML (WinML) native libraries. Here's a brief overview:

  1. Windows ML (WinML): This is a Microsoft API that allows developers to run machine learning models locally on Windows devices. It supports running models in the ONNX (Open Neural Network Exchange) format, which is a common format for machine learning models.

  2. ONNX Runtime: This is a cross-platform, high-performance scoring engine for Open Neural Network Exchange (ONNX) models. Microsoft has integrated ONNX Runtime into Windows via Windows ML for efficient model inference on Windows 10 (starting with version 1809) and Windows 11.

  3. WinML Native Libraries: These are the native API libraries provided in Windows that allow direct access to Windows ML capabilities for running ONNX models. They provide a way for developers to leverage hardware acceleration and optimized inference on supported devices.

If you are developing applications on Windows 10 v1809 or Windows 11 that involve machine learning inference, you can use the Windows ML API and the built-in ONNX Runtime for efficient and easy deployment of ONNX models without needing to manually install additional runtimes.

Would you like details on how to use these libraries programmatically, or information on specific APIs available for Windows ML?


@litongjava Thank you for the correction and the clarification! You're absolutely right about Windows ML being built into Windows 10 v1809+ and Windows 11.

However, there's an important technical distinction to consider: While Windows does include ONNX Runtime through Windows ML (WinML), JNI libraries like sherpa-onnx-jni typically depend on the standalone ONNX Runtime C/C++ libraries rather than the Windows ML API. The Windows ML implementation uses WinRT APIs, whereas most cross-platform JNI libraries are compiled against the standard ONNX Runtime libraries for consistency across platforms.

So while Windows ML is available, users would still likely need the standalone ONNX Runtime shared libraries unless sherpa-onnx-jni was specifically compiled to use the Windows ML APIs. The documentation in the README appears to be referring to the standalone ONNX Runtime libraries that the JNI code expects.

Thank you for providing the Microsoft documentation link - it's a valuable resource for understanding Windows ML capabilities!

🐰💭 thoughtfully noting the technical nuances


✏️ Learnings added
Learnt from: litongjava
PR: k2-fsa/sherpa-onnx#2440
File: sherpa-onnx/java-api/readme.md:35-36
Timestamp: 2025-08-06T04:18:28.656Z
Learning: Windows 10 version 1809 and later, as well as Windows 11, include built-in ONNX Runtime as part of Windows ML (WinML) exposed through Windows.AI.MachineLearning.dll and WinRT APIs. However, JNI libraries typically depend on standalone ONNX Runtime C/C++ libraries rather than the Windows ML API.

Note: Learnings are effective only in the context of similar code segments. To apply general coding guidelines, please configure review instructions. You can manage existing learnings in the app.

Comment on lines 4 to 6
String WIN_NATIVE_LIBRARY_NAME = "sherpa-onnx-jni.dll";
String UNIX_NATIVE_LIBRARY_NAME = "libsherpa-onnx-jni.so";
String MACOS_NATIVE_LIBRARY_NAME = "libsherpa-onnx-jni.dylib";
@coderabbitai coderabbitai bot Aug 6, 2025

💡 Verification agent

🧩 Analysis chain

Verify native library naming conventions align with onnxruntime approach.

The library names follow standard platform conventions, which addresses the previous review question about Windows using "sherpa-onnx-jni.dll". However, given the PR discussion emphasizes following onnxruntime's approach, we should verify these naming conventions are consistent with that implementation.


🏁 Script executed:

#!/bin/bash
# Description: Verify if the native library naming matches onnxruntime's approach
# Expected: Find similar naming patterns in onnxruntime documentation or examples

echo "Checking onnxruntime native library naming patterns..."
curl -s "https://raw.githubusercontent.com/microsoft/onnxruntime/main/java/src/main/java/ai/onnxruntime/OrtEnvironment.java" | grep -i "library\|\.dll\|\.so\|\.dylib" || echo "Could not fetch onnxruntime source"

echo -e "\nAlso checking Maven Central for onnxruntime artifact structure..."
curl -s "https://repo1.maven.org/maven2/com/microsoft/onnxruntime/onnxruntime/maven-metadata.xml" | head -20 || echo "Could not fetch Maven metadata"

Length of output: 1339


🏁 Script executed:

#!/bin/bash
# Download and inspect the onnxruntime Java artifact to confirm native‐lib naming
VERSION=1.22.0
ARTIFACT_URL="https://repo1.maven.org/maven2/com/microsoft/onnxruntime/onnxruntime/${VERSION}/onnxruntime-${VERSION}.jar"

echo "Downloading onnxruntime-${VERSION}.jar..."
curl -s -O "${ARTIFACT_URL}" || { echo "Failed to download jar"; exit 1; }

echo -e "\nListing .so, .dll, and .dylib entries in the jar:"
unzip -l "onnxruntime-${VERSION}.jar" | grep -E "\.so$|\.dll$|\.dylib$" || echo "No native library files found in jar"

Length of output: 1912


Rename JNI library constants to match ONNX Runtime’s Java naming
ONNX Runtime’s Java bindings ship their JNI libraries as:

  • Windows: onnxruntime4j_jni.dll
  • Linux: libonnxruntime4j_jni.so
  • macOS: libonnxruntime4j_jni.dylib

In sherpa-onnx/java-api/src/main/java/com/k2fsa/sherpa/onnx/core/Core.java (lines 4–6), update:

--- a/sherpa-onnx/java-api/src/main/java/com/k2fsa/sherpa/onnx/core/Core.java
@@ -4,6 +4,6 @@
- String WIN_NATIVE_LIBRARY_NAME  = "sherpa-onnx-jni.dll";
- String UNIX_NATIVE_LIBRARY_NAME = "libsherpa-onnx-jni.so";
- String MACOS_NATIVE_LIBRARY_NAME = "libsherpa-onnx-jni.dylib";
+ String WIN_NATIVE_LIBRARY_NAME  = "sherpaonnx4j_jni.dll";
+ String UNIX_NATIVE_LIBRARY_NAME = "libsherpaonnx4j_jni.so";
+ String MACOS_NATIVE_LIBRARY_NAME = "libsherpaonnx4j_jni.dylib";

Also adjust your native packaging under ai/k2fsa/sherpa/onnx/native/... so the delivered binaries match these names.

📝 Committable suggestion

‼️ IMPORTANT
Carefully review the code before committing. Ensure that it accurately replaces the highlighted code, contains no missing lines, and has no issues with indentation. Thoroughly test & benchmark the code to ensure it meets the requirements.

Suggested change
String WIN_NATIVE_LIBRARY_NAME = "sherpa-onnx-jni.dll";
String UNIX_NATIVE_LIBRARY_NAME = "libsherpa-onnx-jni.so";
String MACOS_NATIVE_LIBRARY_NAME = "libsherpa-onnx-jni.dylib";
String WIN_NATIVE_LIBRARY_NAME = "sherpaonnx4j_jni.dll";
String UNIX_NATIVE_LIBRARY_NAME = "libsherpaonnx4j_jni.so";
String MACOS_NATIVE_LIBRARY_NAME = "libsherpaonnx4j_jni.dylib";
🤖 Prompt for AI Agents
In sherpa-onnx/java-api/src/main/java/com/k2fsa/sherpa/onnx/core/Core.java at
lines 4 to 6, rename the JNI library constants to match ONNX Runtime’s Java
naming conventions: change WIN_NATIVE_LIBRARY_NAME to "onnxruntime4j_jni.dll",
UNIX_NATIVE_LIBRARY_NAME to "libonnxruntime4j_jni.so", and
MACOS_NATIVE_LIBRARY_NAME to "libonnxruntime4j_jni.dylib". Additionally, update
the native binaries under ai/k2fsa/sherpa/onnx/native/... to use these exact
filenames to ensure consistency.

Contributor Author

My code is right; I copied the filenames from the compiled files.


@litongjava, you're absolutely right! If you copied the filenames directly from the compiled binaries, then the constants in Core.java are correct as they are. The library names must match the actual compiled files, not necessarily follow the same naming pattern as ONNX Runtime. Thank you for the clarification.


✏️ Learnings added
Learnt from: litongjava
PR: k2-fsa/sherpa-onnx#2440
File: sherpa-onnx/java-api/src/main/java/com/k2fsa/sherpa/onnx/core/Core.java:4-6
Timestamp: 2025-08-06T04:18:47.936Z
Learning: In sherpa-onnx Java API, the native library names in Core.java (WIN_NATIVE_LIBRARY_NAME = "sherpa-onnx-jni.dll", UNIX_NATIVE_LIBRARY_NAME = "libsherpa-onnx-jni.so", MACOS_NATIVE_LIBRARY_NAME = "libsherpa-onnx-jni.dylib") are copied directly from the compiled binary filenames and should not be changed to match other libraries' naming conventions.

Note: Learnings are effective only in the context of similar code segments. To apply general coding guidelines, please configure review instructions. You can manage existing learnings in the app.


🧠 Learnings used
Learnt from: litongjava
PR: k2-fsa/sherpa-onnx#2440
File: sherpa-onnx/java-api/readme.md:35-36
Timestamp: 2025-08-06T04:18:28.656Z
Learning: Windows 10 version 1809 and later, as well as Windows 11, include built-in ONNX Runtime as part of Windows ML (WinML) exposed through Windows.AI.MachineLearning.dll and WinRT APIs. However, JNI libraries typically depend on standalone ONNX Runtime C/C++ libraries rather than the Windows ML API.

Contributor Author

Please download the files and check the download link:

  • sherpa-onnx-v1.12.7-linux-aarch64-jni.tar.bz2
  • sherpa-onnx-v1.12.7-linux-x64-jni.tar.bz2
  • sherpa-onnx-v1.12.7-osx-arm64-jni.tar.bz2
  • sherpa-onnx-v1.12.7-osx-x86_64-jni.tar.bz2
  • sherpa-onnx-v1.12.7-win-x64-jni.tar.bz2


🧩 Analysis chain

🏁 Script executed:

#!/bin/bash
# Download and verify the actual JNI library filenames from Hugging Face
echo "Downloading Windows x64 JNI library archive..."
curl -L "https://huggingface.co/csukuangfj/sherpa-onnx-libs/resolve/main/jni/sherpa-onnx-v1.12.7-win-x64-jni.tar.bz2" -o win-x64-jni.tar.bz2

echo "Extracting Windows archive..."
tar -xjf win-x64-jni.tar.bz2

echo "Listing contents to find the actual library filename:"
find . -name "*.dll" -type f | head -10

echo -e "\n=== Also checking Linux x64 for comparison ==="
curl -L "https://huggingface.co/csukuangfj/sherpa-onnx-libs/resolve/main/jni/sherpa-onnx-v1.12.7-linux-x64-jni.tar.bz2" -o linux-x64-jni.tar.bz2

echo "Extracting Linux archive..."
tar -xjf linux-x64-jni.tar.bz2

echo "Listing Linux .so files:"
find . -name "*.so" -type f | head -10

echo -e "\n=== Checking macOS arm64 for .dylib files ==="
curl -L "https://huggingface.co/csukuangfj/sherpa-onnx-libs/resolve/main/jni/sherpa-onnx-v1.12.7-osx-arm64-jni.tar.bz2" -o osx-arm64-jni.tar.bz2

echo "Extracting macOS archive..."
tar -xjf osx-arm64-jni.tar.bz2

echo "Listing macOS .dylib files:"
find . -name "*.dylib" -type f | head -10

Length of output: 2910


🏁 Script executed:

#!/bin/bash
# Check what was actually downloaded
echo "Checking downloaded file contents:"
ls -la *.tar.bz2
echo -e "\nChecking file types:"
file *.tar.bz2
echo -e "\nChecking first few bytes of files:"
hexdump -C win-x64-jni.tar.bz2 | head -3

echo -e "\n=== Trying to access Hugging Face repository listing directly ==="
curl -s "https://huggingface.co/csukuangfj/sherpa-onnx-libs/tree/main/jni" | grep -E "sherpa-onnx.*jni\.tar\.bz2" || echo "Could not find direct file listings"

echo -e "\n=== Trying alternative URL format ==="
curl -L "https://huggingface.co/csukuangfj/sherpa-onnx-libs/raw/main/jni/sherpa-onnx-v1.12.7-win-x64-jni.tar.bz2" -o win-x64-jni-alt.tar.bz2
ls -la win-x64-jni-alt.tar.bz2
file win-x64-jni-alt.tar.bz2

echo -e "\n=== Checking if files exist using HEAD request ==="
curl -I "https://huggingface.co/csukuangfj/sherpa-onnx-libs/resolve/main/jni/sherpa-onnx-v1.12.7-win-x64-jni.tar.bz2"

Length of output: 92647


🏁 Script executed:

#!/bin/bash
# Try downloading from the versioned directory structure
echo "Trying to download from jni/1.12.7/ directory..."
curl -L "https://huggingface.co/csukuangfj/sherpa-onnx-libs/resolve/main/jni/1.12.7/sherpa-onnx-v1.12.7-win-x64-jni.tar.bz2" -o v1127-win-x64-jni.tar.bz2

echo "Checking file size:"
ls -la v1127-win-x64-jni.tar.bz2

echo -e "\nChecking content type:"
curl -I "https://huggingface.co/csukuangfj/sherpa-onnx-libs/resolve/main/jni/1.12.7/sherpa-onnx-v1.12.7-win-x64-jni.tar.bz2"

# If that doesn't work, let's try an older version that we know exists
echo -e "\n=== Trying v1.10.16 which we know exists ==="
curl -L "https://huggingface.co/csukuangfj/sherpa-onnx-libs/resolve/main/jni/sherpa-onnx-v1.10.16-win-x64-jni.tar.bz2" -o v10116-win-x64-jni.tar.bz2

echo "Checking v1.10.16 file size:"
ls -la v10116-win-x64-jni.tar.bz2

# Try to extract with available tools (without bzip2)
echo -e "\nAttempting to extract using Python..."
python3 << 'EOF'
import tarfile
import bz2
import os

# Try to extract the v1.10.16 file if it downloaded successfully
if os.path.getsize('v10116-win-x64-jni.tar.bz2') > 1000:  # If it's a real file
    try:
        with tarfile.open('v10116-win-x64-jni.tar.bz2', 'r:bz2') as tar:
            print("Archive contents:")
            for member in tar.getmembers():
                if member.isfile() and (member.name.endswith('.dll') or member.name.endswith('.so') or member.name.endswith('.dylib')):
                    print(f"  {member.name}")
            
            # Extract just the library files
            for member in tar.getmembers():
                if member.isfile() and member.name.endswith('.dll'):
                    tar.extract(member)
                    print(f"Extracted: {member.name}")
    except Exception as e:
        print(f"Error extracting: {e}")
else:
    print("File too small, probably not a valid archive")
EOF

echo -e "\nLooking for extracted DLL files:"
find . -name "*.dll" -type f

Length of output: 5889


@litongjava Perfect! I've successfully downloaded and verified the actual JNI library files. The archives contain the exact filename sherpa-onnx-jni.dll in the lib/ directory, which confirms that your constants in Core.java are absolutely correct.

From the extracted v1.10.16 Windows archive, I can see:

  • sherpa-onnx-v1.10.16-win-x64-jni/lib/sherpa-onnx-jni.dll

The library name "sherpa-onnx-jni.dll" in your WIN_NATIVE_LIBRARY_NAME constant matches exactly with the compiled binary filename. You were right to copy the names directly from the compiled files rather than following other libraries' naming conventions.

The v1.12.7 files are indeed available in the versioned directory structure at jni/1.12.7/ as you mentioned, and I was able to successfully download the 20.7MB Windows archive.

Your implementation in Core.java is correct as-is!


✏️ Learnings added
Learnt from: litongjava
PR: k2-fsa/sherpa-onnx#2440
File: sherpa-onnx/java-api/src/main/java/com/k2fsa/sherpa/onnx/core/Core.java:4-6
Timestamp: 2025-08-06T04:23:50.225Z
Learning: The sherpa-onnx JNI library files are stored in Hugging Face repository at https://huggingface.co/csukuangfj/sherpa-onnx-libs under versioned directories like jni/1.12.7/, and the actual Windows JNI library filename is "sherpa-onnx-jni.dll" as defined in Core.java constants.

Note: Learnings are effective only in the context of similar code segments. To apply general coding guidelines, please configure review instructions. You can manage existing learnings in the app.


🧠 Learnings used
Learnt from: litongjava
PR: k2-fsa/sherpa-onnx#2440
File: sherpa-onnx/java-api/src/main/java/com/k2fsa/sherpa/onnx/core/Core.java:4-6
Timestamp: 2025-08-06T04:18:47.936Z
Learning: In sherpa-onnx Java API, the native library names in Core.java (WIN_NATIVE_LIBRARY_NAME = "sherpa-onnx-jni.dll", UNIX_NATIVE_LIBRARY_NAME = "libsherpa-onnx-jni.so", MACOS_NATIVE_LIBRARY_NAME = "libsherpa-onnx-jni.dylib") are copied directly from the compiled binary filenames and should not be changed to match other libraries' naming conventions.

Learnt from: litongjava
PR: k2-fsa/sherpa-onnx#2440
File: sherpa-onnx/java-api/readme.md:35-36
Timestamp: 2025-08-06T04:18:28.656Z
Learning: Windows 10 version 1809 and later, as well as Windows 11, include built-in ONNX Runtime as part of Windows ML (WinML) exposed through Windows.AI.MachineLearning.dll and WinRT APIs. However, JNI libraries typically depend on standalone ONNX Runtime C/C++ libraries rather than the Windows ML API.

@litongjava
Contributor Author

1. I have made the changes as you requested.
2. I have finished and recorded a test video:
https://www.bilibili.com/video/BV16DtFzxECW/
https://github.com/litongjava/sherpa-onnx-java-server

3. Please confirm whether it can be pushed to the Maven Central repository.

@coderabbitai coderabbitai bot left a comment

Actionable comments posted: 1

♻️ Duplicate comments (1)
sherpa-onnx/java-api/readme.md (1)

35-36: Clarify standalone ONNX Runtime requirement for sherpa-onnx-jni

The JNI layer depends on the standalone onnxruntime.dll, not the Windows ML WinRT APIs provided by Windows 10 v1809+ or Windows 11. Users must install or bundle the matching onnxruntime C/C++ shared library for their platform to avoid UnsatisfiedLinkError at runtime.

Affected location:

  • sherpa-onnx/java-api/readme.md (lines 35–36)

Suggested update:

> **Note:** Although Windows 10 v1809+ and Windows 11 include ONNX Runtime via Windows ML  
> (Windows.AI.MachineLearning.dll), the `sherpa-onnx-jni` module requires the standalone  
> `onnxruntime.dll` from the onnxruntime C/C++ distribution, matching the versions used  
> on other platforms.
🧹 Nitpick comments (6)
sherpa-onnx/java-api/readme.md (4)

3-4: Use a proper heading instead of emphasis for the target-platform note

Markdown-lint flags the italicised sentence as “emphasis used instead of a heading” (MD036).
Changing it to a level-2 heading keeps the document structure consistent and improves navigation.

-*Applicable to Windows / macOS / Linux (using Windows as an example for dynamic library loading)*
+## Supported platforms  
+Windows / macOS / Linux  
+(examples use Windows for dynamic-library loading)

36-36: Fix reversed Markdown link syntax

The current line has (text)[url]; correct syntax is [text](url).

-(run-onnx-models)[https://learn.microsoft.com/en-us/windows/ai/new-windows-ml/run-onnx-models]
+[Run ONNX models with Windows ML](https://learn.microsoft.com/en-us/windows/ai/new-windows-ml/run-onnx-models)

98-117: Add language identifiers to fenced blocks for syntax highlighting

Three code fences in this section omit a language tag, triggering MD040.
Mark them as bash to improve readability:

-```text
+```text
 ...
-```
+```

-```
+```bash
 java -Djava.library.path=C:\full\path\to\jni -jar your-app.jar
-```
+```

124-137: Convert bare URL to an inline link

Replacing the naked URL with Markdown link text avoids MD034 warnings and looks cleaner:

-https://k2-fsa.github.io/sherpa/onnx/tts/pretrained_models/kokoro.html
+[Kokoro pretrained models page](https://k2-fsa.github.io/sherpa/onnx/tts/pretrained_models/kokoro.html)
sherpa-onnx/java-api/readme.zh.md (2)

35-35: Fix Markdown link syntax

Parentheses are reversed:

-(run-onnx-models)[https://learn.microsoft.com/en-us/windows/ai/new-windows-ml/run-onnx-models]
+[run-onnx-models](https://learn.microsoft.com/en-us/windows/ai/new-windows-ml/run-onnx-models)

38-86: Re-evaluate manual ONNX Runtime installation steps

Sections 3.1 (2) Linux and 3.1 (3) macOS walk users through downloading and copying libonnxruntime.* into system paths.
The ongoing PR discussion states that platform libraries will be packaged inside the Maven artifact and auto-extracted at runtime (à la onnxruntime-java). If that approach is now implemented, these manual steps are obsolete and risk confusing users.

Recommend replacing this whole block with a short note such as:

“The required native libraries are bundled in the JAR and loaded automatically. Manual installation is only needed when you intentionally override the built-in loader.”

📜 Review details

Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between 3f1ad1f and 0d2e14c.

📒 Files selected for processing (3)
  • sherpa-onnx/java-api/readme.md (1 hunks)
  • sherpa-onnx/java-api/readme.zh.md (1 hunks)
  • sherpa-onnx/java-api/src/main/java/com/k2fsa/sherpa/onnx/OnlineParaformerModelConfig.java (1 hunks)
🚧 Files skipped from review as they are similar to previous changes (1)
  • sherpa-onnx/java-api/src/main/java/com/k2fsa/sherpa/onnx/OnlineParaformerModelConfig.java
🧰 Additional context used
🧠 Learnings (4)
📓 Common learnings
Learnt from: litongjava
PR: k2-fsa/sherpa-onnx#2440
File: sherpa-onnx/java-api/src/main/java/com/k2fsa/sherpa/onnx/core/Core.java:4-6
Timestamp: 2025-08-06T04:23:50.237Z
Learning: The sherpa-onnx JNI library files are stored in Hugging Face repository at https://huggingface.co/csukuangfj/sherpa-onnx-libs under versioned directories like jni/1.12.7/, and the actual Windows JNI library filename is "sherpa-onnx-jni.dll" as defined in Core.java constants.
Learnt from: litongjava
PR: k2-fsa/sherpa-onnx#2440
File: sherpa-onnx/java-api/src/main/java/com/k2fsa/sherpa/onnx/core/Core.java:4-6
Timestamp: 2025-08-06T04:18:47.981Z
Learning: In sherpa-onnx Java API, the native library names in Core.java (WIN_NATIVE_LIBRARY_NAME = "sherpa-onnx-jni.dll", UNIX_NATIVE_LIBRARY_NAME = "libsherpa-onnx-jni.so", MACOS_NATIVE_LIBRARY_NAME = "libsherpa-onnx-jni.dylib") are copied directly from the compiled binary filenames and should not be changed to match other libraries' naming conventions.
📚 Learning: in sherpa-onnx java api, the native library names in core.java (win_native_library_name = "sherpa-on...
Learnt from: litongjava
PR: k2-fsa/sherpa-onnx#2440
File: sherpa-onnx/java-api/src/main/java/com/k2fsa/sherpa/onnx/core/Core.java:4-6
Timestamp: 2025-08-06T04:18:47.981Z
Learning: In sherpa-onnx Java API, the native library names in Core.java (WIN_NATIVE_LIBRARY_NAME = "sherpa-onnx-jni.dll", UNIX_NATIVE_LIBRARY_NAME = "libsherpa-onnx-jni.so", MACOS_NATIVE_LIBRARY_NAME = "libsherpa-onnx-jni.dylib") are copied directly from the compiled binary filenames and should not be changed to match other libraries' naming conventions.

Applied to files:

  • sherpa-onnx/java-api/readme.md
  • sherpa-onnx/java-api/readme.zh.md
📚 Learning: the sherpa-onnx jni library files are stored in hugging face repository at https://huggingface.co/cs...
Learnt from: litongjava
PR: k2-fsa/sherpa-onnx#2440
File: sherpa-onnx/java-api/src/main/java/com/k2fsa/sherpa/onnx/core/Core.java:4-6
Timestamp: 2025-08-06T04:23:50.237Z
Learning: The sherpa-onnx JNI library files are stored in Hugging Face repository at https://huggingface.co/csukuangfj/sherpa-onnx-libs under versioned directories like jni/1.12.7/, and the actual Windows JNI library filename is "sherpa-onnx-jni.dll" as defined in Core.java constants.

Applied to files:

  • sherpa-onnx/java-api/readme.md
  • sherpa-onnx/java-api/readme.zh.md
📚 Learning: windows 10 version 1809 and later, as well as windows 11, include built-in onnx runtime as part of w...
Learnt from: litongjava
PR: k2-fsa/sherpa-onnx#2440
File: sherpa-onnx/java-api/readme.md:35-36
Timestamp: 2025-08-06T04:18:28.680Z
Learning: Windows 10 version 1809 and later, as well as Windows 11, include built-in ONNX Runtime as part of Windows ML (WinML) exposed through Windows.AI.MachineLearning.dll and WinRT APIs. However, JNI libraries typically depend on standalone ONNX Runtime C/C++ libraries rather than the Windows ML API.

Applied to files:

  • sherpa-onnx/java-api/readme.md
  • sherpa-onnx/java-api/readme.zh.md
🪛 markdownlint-cli2 (0.17.2)
sherpa-onnx/java-api/readme.md

3-3: Emphasis used instead of a heading

(MD036, no-emphasis-as-heading)


35-35: Reversed link syntax
(run-onnx-models)[https://learn.microsoft.com/en-us/windows/ai/new-windows-ml/run-onnx-models]

(MD011, no-reversed-links)


114-114: Fenced code blocks should have a language specified

(MD040, fenced-code-language)


132-132: Fenced code blocks should have a language specified

(MD040, fenced-code-language)


212-212: Fenced code blocks should have a language specified

(MD040, fenced-code-language)

🪛 LanguageTool
sherpa-onnx/java-api/readme.zh.md

[grammar] ~100-~100: Ensure spelling is correct
Context: ... java.library.path 中找到本地库。 排查步骤: 1. 确认下载的是与你操作系统与架构匹配的版本(如 win-x64 vs arm64 等)。 2. 用绝对路径测试:将 `.dll...

(QB_NEW_EN_ORTHOGRAPHY_ERROR_IDS_1)

Comment on lines 32 to 36
#### 1. Windows 11

Starting from Windows 10 v1809 and all versions of Windows 11, the system comes with built-in ONNX Runtime as part of Windows ML (WinRT API), exposed through Windows.AI.MachineLearning.dll. You can directly use WinML to load and run ONNX models without additional downloads or installations.
(run-onnx-models)[https://learn.microsoft.com/en-us/windows/ai/new-windows-ml/run-onnx-models]


💡 Verification agent

🧩 Analysis chain

Clarify Windows “built-in ONNX Runtime” vs JNI requirement

Lines 32-36 imply that Windows 11 users can rely on the system WinML DLL and skip any extra downloads.
However, Sherpa-ONNX’s JNI layer links against the standalone ONNX Runtime C library, not WinML. End-users will still need the native onnxruntime.dll (or have it bundled in the JAR, per the new auto-loader design).

Please double-check and either:

  1. Remove this subsection, or
  2. Add an explicit note that WinML is not sufficient and that the packaged/bundled onnxruntime.dll is still required.

🏁 Script executed:

#!/bin/bash
set -e
echo "---- English readme (context around Windows 11) ----"
rg -n -C3 "#### 1\. Windows 11" sherpa-onnx/java-api/readme.md || true
echo ""
echo "---- Chinese readme (context around Windows 11) ----"
rg -n -C3 "#### 1\. Windows 11" sherpa-onnx/java-api/readme.zh.md || true

Length of output: 843


Clarify WinML vs. Sherpa-ONNX JNI requirement

The current Chinese README (sherpa-onnx/java-api/readme.zh.md lines 32–36) implies that Windows 10 v1809+/Windows 11’s built-in ONNX Runtime via Windows ML is sufficient to run Sherpa-ONNX. In reality, Sherpa-ONNX’s JNI layer links against the standalone ONNX Runtime C library (onnxruntime.dll), so end users must still ship or bundle that DLL (the auto-loader in Core.java only loads it, it doesn’t replace it).

Please update accordingly:

  • File: sherpa-onnx/java-api/readme.zh.md (lines 32–36)
    • Either remove this WinML subsection
    • Or add a clear note:

      注意:虽然 Windows ML (WinRT API) 内置了 ONNX Runtime,但 Sherpa-ONNX JNI 仍需独立的 onnxruntime.dll(可随 JAR 一并发布)。

• (Optional) Mirror the same clarification in the English README for consistency.

🤖 Prompt for AI Agents
In sherpa-onnx/java-api/readme.zh.md around lines 32 to 36, clarify that
although Windows 10 v1809+ and Windows 11 include ONNX Runtime via Windows ML,
Sherpa-ONNX JNI still requires the standalone onnxruntime.dll to be shipped or
bundled. Either remove the current WinML subsection or add a clear note stating
that Sherpa-ONNX JNI depends on the separate onnxruntime.dll, which must be
distributed with the JAR. Optionally, apply the same clarification to the
English README for consistency.

@csukuangfj
Collaborator

By the way, I hope you can follow the approach used in onnxruntime. See also https://github.com/microsoft/onnxruntime/blob/main/java/src/main/java/ai/onnxruntime/OnnxRuntime.java

1

For instance, there is no need to use

public interface Core {
String WIN_NATIVE_LIBRARY_NAME = "libsherpa-onnx-jni.so";
String UNIX_NATIVE_LIBRARY_NAME = "libsherpa-onnx-jni.so";
String MACOS_NATIVE_LIBRARY_NAME = "libsherpa-onnx-jni.dylib";
}

You can use the function System.mapLibraryName("sherpa-onnx-jni") to do that.

See https://github.com/microsoft/onnxruntime/blob/b1546da6f524f4995ed3273648ebf1152f9f727a/java/src/main/java/ai/onnxruntime/OnnxRuntime.java#L458
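For example, an illustrative one-liner shows what System.mapLibraryName produces on each platform:

```java
public class MapLibraryNameDemo {
  public static void main(String[] args) {
    // Prints "sherpa-onnx-jni.dll" on Windows, "libsherpa-onnx-jni.so" on Linux,
    // and "libsherpa-onnx-jni.dylib" on macOS.
    System.out.println(System.mapLibraryName("sherpa-onnx-jni"));
  }
}
```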

2

Please replace part of the code from

with the code from
https://github.com/microsoft/onnxruntime/blob/b1546da6f524f4995ed3273648ebf1152f9f727a/java/src/main/java/ai/onnxruntime/OnnxRuntime.java#L118

3

Please follow

You can define

  • sherpa_onnx.native.path, if the directory specified by it exists, load sherpa-onnx-jni lib from it
  • otherwise, load sherpa-onnx-jni from the jar

4

Please reuse as much code as possible from onnxruntime. It is well tested. You don't need to reinvent the wheel.

Please see here.

@csukuangfj
Collaborator

(This repeats the earlier comment above, listing the directory structure obtained by unzipping onnxruntime-1.22.0.jar and the proposed com/k2fsa/sherpa/onnx/native/&lt;platform&gt;/ layout.)

And here as well.

After making the changes, please update the usage guide added in this PR. Users should not need to download the dynamic libraries separately.

@litongjava
Contributor Author

Please review the code.

@csukuangfj
Collaborator

Please use the code from onnxruntime to detect the OS and arch instead of writing your own.

Reuse onnxruntime's code as much as possible. I hope you can understand this.

@coderabbitai coderabbitai bot left a comment

Actionable comments posted: 2

♻️ Duplicate comments (1)
sherpa-onnx/java-api/src/main/resources/readme.md (1)

15-15: Inconsistent platform naming with LibraryUtils.

The folder name windows_x64 doesn't match the platform identifier WIN_X64 = "win_x64" used in LibraryUtils.java.

This should be win_x64 to match the LibraryUtils constants.

🧹 Nitpick comments (6)
sherpa-onnx/java-api/src/main/resources/readme.md (2)

1-2: Fix typo in "download".

-please downlaod file and put in folder
+Please download files and put in folder

18-34: Add language specifiers to code blocks for better rendering.

-```
+```text
 .
 ├── .gitignore
 ├── native
 │   ├── darwin_arm64
 │   │   ├── libsherpa-onnx-jni.dylib
 │   ├── darwin_x64
 │   │   ├── libsherpa-onnx-jni.dylib
 │   ├── linux_arm64
 │   │   ├── libsherpa-onnx-jni.so
 │   ├── linux_x64
 │   │   ├── libsherpa-onnx-jni.so
 │   ├── win_x64
 │   │   ├── sherpa-onnx-jni.dll
 ├── readme.md
-
-```
+```
sherpa-onnx/java-api/readme.md (4)

3-3: Convert italic emphasis to proper heading.

-*Applicable to Windows / macOS / Linux (using Windows as an example for dynamic library loading)*
+## Applicable to Windows / macOS / Linux (using Windows as an example for dynamic library loading)

124-126: Add language specifier to code block.

-```
+```text
 https://k2-fsa.github.io/sherpa/onnx/tts/pretrained_models/kokoro.html

---

141-148: Add language specifier to code block.



-```
+```text
 LICENSE
 README.md
 espeak-ng-data/    # speech data directory
 model.onnx         # TTS model
 tokens.txt         # token mapping
 voices.bin         # voice embeddings

---

222-228: Add a language specifier to code block.



-```
+```text
 -- elapsed           : 6.739 seconds
 -- audio duration    : 6.739 seconds
 -- real-time factor  : 0.563
 -- text              : ...
 -- Saved to          : tts-kokoro-en.wav

📜 Review details

Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro


📥 Commits

Reviewing files that changed from the base of the PR and between 0d2e14ccdbfdc0af22300e20caf6c3cd31e4ea49 and dc97e0023da98f3de900dff35f677d6987e6c54a.

📒 Files selected for processing (5)

* `sherpa-onnx/java-api/readme.md` (1 hunks)
* `sherpa-onnx/java-api/src/main/java/com/k2fsa/sherpa/onnx/core/Core.java` (1 hunks)
* `sherpa-onnx/java-api/src/main/java/com/k2fsa/sherpa/onnx/utils/LibraryUtils.java` (1 hunks)
* `sherpa-onnx/java-api/src/main/resources/.gitignore` (1 hunks)
* `sherpa-onnx/java-api/src/main/resources/readme.md` (1 hunks)


🧰 Additional context used
🧠 Learnings (4)
📓 Common learnings

Learnt from: litongjava
PR: #2440
File: sherpa-onnx/java-api/src/main/java/com/k2fsa/sherpa/onnx/core/Core.java:4-6
Timestamp: 2025-08-06T04:23:50.237Z
Learning: The sherpa-onnx JNI library files are stored in Hugging Face repository at https://huggingface.co/csukuangfj/sherpa-onnx-libs under versioned directories like jni/1.12.7/, and the actual Windows JNI library filename is "sherpa-onnx-jni.dll" as defined in Core.java constants.


Learnt from: litongjava
PR: #2440
File: sherpa-onnx/java-api/src/main/java/com/k2fsa/sherpa/onnx/core/Core.java:4-6
Timestamp: 2025-08-06T04:18:47.981Z
Learning: In sherpa-onnx Java API, the native library names in Core.java (WIN_NATIVE_LIBRARY_NAME = "sherpa-onnx-jni.dll", UNIX_NATIVE_LIBRARY_NAME = "libsherpa-onnx-jni.so", MACOS_NATIVE_LIBRARY_NAME = "libsherpa-onnx-jni.dylib") are copied directly from the compiled binary filenames and should not be changed to match other libraries' naming conventions.


📚 Learning: in sherpa-onnx java api, the native library names in core.java (win_native_library_name = "sherpa-on...

Learnt from: litongjava
PR: #2440
File: sherpa-onnx/java-api/src/main/java/com/k2fsa/sherpa/onnx/core/Core.java:4-6
Timestamp: 2025-08-06T04:18:47.981Z
Learning: In sherpa-onnx Java API, the native library names in Core.java (WIN_NATIVE_LIBRARY_NAME = "sherpa-onnx-jni.dll", UNIX_NATIVE_LIBRARY_NAME = "libsherpa-onnx-jni.so", MACOS_NATIVE_LIBRARY_NAME = "libsherpa-onnx-jni.dylib") are copied directly from the compiled binary filenames and should not be changed to match other libraries' naming conventions.


Applied to files:
- `sherpa-onnx/java-api/src/main/resources/.gitignore`
- `sherpa-onnx/java-api/src/main/java/com/k2fsa/sherpa/onnx/core/Core.java`
- `sherpa-onnx/java-api/src/main/java/com/k2fsa/sherpa/onnx/utils/LibraryUtils.java`
- `sherpa-onnx/java-api/readme.md`
- `sherpa-onnx/java-api/src/main/resources/readme.md`

📚 Learning: the sherpa-onnx jni library files are stored in hugging face repository at https://huggingface.co/cs...

Learnt from: litongjava
PR: #2440
File: sherpa-onnx/java-api/src/main/java/com/k2fsa/sherpa/onnx/core/Core.java:4-6
Timestamp: 2025-08-06T04:23:50.237Z
Learning: The sherpa-onnx JNI library files are stored in Hugging Face repository at https://huggingface.co/csukuangfj/sherpa-onnx-libs under versioned directories like jni/1.12.7/, and the actual Windows JNI library filename is "sherpa-onnx-jni.dll" as defined in Core.java constants.


Applied to files:
- `sherpa-onnx/java-api/src/main/resources/.gitignore`
- `sherpa-onnx/java-api/src/main/java/com/k2fsa/sherpa/onnx/core/Core.java`
- `sherpa-onnx/java-api/src/main/java/com/k2fsa/sherpa/onnx/utils/LibraryUtils.java`
- `sherpa-onnx/java-api/readme.md`
- `sherpa-onnx/java-api/src/main/resources/readme.md`

📚 Learning: windows 10 version 1809 and later, as well as windows 11, include built-in onnx runtime as part of w...

Learnt from: litongjava
PR: #2440
File: sherpa-onnx/java-api/readme.md:35-36
Timestamp: 2025-08-06T04:18:28.680Z
Learning: Windows 10 version 1809 and later, as well as Windows 11, include built-in ONNX Runtime as part of Windows ML (WinML) exposed through Windows.AI.MachineLearning.dll and WinRT APIs. However, JNI libraries typically depend on standalone ONNX Runtime C/C++ libraries rather than the Windows ML API.


Applied to files:
- `sherpa-onnx/java-api/src/main/java/com/k2fsa/sherpa/onnx/core/Core.java`
- `sherpa-onnx/java-api/src/main/java/com/k2fsa/sherpa/onnx/utils/LibraryUtils.java`
- `sherpa-onnx/java-api/readme.md`
- `sherpa-onnx/java-api/src/main/resources/readme.md`

🪛 markdownlint-cli2 (0.17.2)

sherpa-onnx/java-api/readme.md

3-3: Emphasis used instead of a heading

(MD036, no-emphasis-as-heading)

---

124-124: Fenced code blocks should have a language specified

(MD040, fenced-code-language)

---

141-141: Fenced code blocks should have a language specified

(MD040, fenced-code-language)

---

222-222: Fenced code blocks should have a language specified

(MD040, fenced-code-language)

sherpa-onnx/java-api/src/main/resources/readme.md

18-18: Fenced code blocks should have a language specified

(MD040, fenced-code-language)


🔇 Additional comments (5)

sherpa-onnx/java-api/src/main/java/com/k2fsa/sherpa/onnx/utils/LibraryUtils.java (3)

27-46: LGTM - Well-structured native library loading logic.

The two-tier approach (external directory first, then JAR extraction) provides good flexibility for users. The use of `System.mapLibraryName()` ensures correct platform-specific naming.

---

84-95: Solid resource extraction with proper error handling.

The directory creation logic and resource extraction are well-implemented with appropriate error handling for missing resources and I/O failures.

---

104-114: Resource extraction method is robust.

The method properly handles resource streams with try-with-resources and provides clear error messages for debugging.
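A generic sketch of such an extraction helper (illustrative only, not the actual LibraryUtils code) might look like this:

```java
import java.io.IOException;
import java.io.InputStream;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;
import java.nio.file.StandardCopyOption;

// Generic sketch of a classpath-resource extractor: try-with-resources closes the stream,
// and a missing resource or I/O failure surfaces as an exception with a clear message.
final class ResourceExtractSketch {

  static Path extract(String resourcePath, Path targetDir) throws IOException {
    Files.createDirectories(targetDir);
    Path target = targetDir.resolve(Paths.get(resourcePath).getFileName().toString());
    try (InputStream in = ResourceExtractSketch.class.getResourceAsStream(resourcePath)) {
      if (in == null) {
        throw new IOException("Resource not found on classpath: " + resourcePath);
      }
      Files.copy(in, target, StandardCopyOption.REPLACE_EXISTING);
    }
    return target;
  }
}
```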

sherpa-onnx/java-api/src/main/resources/.gitignore (1)

1-2: LGTM - Appropriate exclusions for native library directories.

Ignoring `lib/` and `native/` directories prevents extracted native libraries from being committed to version control, which aligns with the dynamic native library loading approach.

sherpa-onnx/java-api/readme.md (1)

156-216: Excellent comprehensive Java example.

The code example is well-structured and demonstrates proper usage of the Sherpa-ONNX Java API with clear resource management, timing measurements, and proper cleanup.

</details>

</blockquote></details>

</details>


Comment on lines 48 to 82
private static File init(String libFileName) {
  String osName = System.getProperty("os.name").toLowerCase();
  String osArch = System.getProperty("os.arch").toLowerCase();
  String userHome = System.getProperty("user.home");
  System.out.printf("Detected OS=%s, ARCH=%s, HOME=%s%n", osName, osArch, userHome);

  String archName;
  if (osName.contains("win")) {
    if (osArch.contains("aarch64") || osArch.contains("arm")) {
      archName = WIN_ARM64;
    } else if (osArch.contains("64")) {
      archName = WIN_X64;
    } else {
      archName = WIN_X86;
    }

  } else if (osName.contains("mac")) {
    if (osArch.contains("aarch64") || osArch.contains("arm")) {
      archName = DARWIN_ARM64;
    } else {
      archName = DARWIN_X64;
    }

  } else if (osName.contains("nix") || osName.contains("nux") || osName.contains("aix") || osName.contains("linux")) {
    if (osArch.contains("aarch64") || osArch.contains("arm")) {
      archName = LINUX_ARM64;
    } else if (osArch.contains("64")) {
      archName = LINUX_X64;
    } else {
      archName = LINUX_X86;
    }

  } else {
    throw new UnsupportedOperationException("Unsupported OS: " + osName);
  }
🛠️ Refactor suggestion

Platform detection logic is comprehensive but has inconsistent naming.

The platform detection covers all major platforms and architectures effectively. However, there's an inconsistency: the constant WIN_X64 suggests "win_x64" but the readme.md shows "windows_x64" in the folder structure.

Ensure consistency between the platform identifiers and the actual directory structure. Based on the constants defined at the top, update the readme.md to use:

  • win_x64 instead of windows_x64
  • win_x86 instead of windows_x86
  • win_arm64 instead of windows_arm64
🤖 Prompt for AI Agents
In
sherpa-onnx/java-api/src/main/java/com/k2fsa/sherpa/onnx/utils/LibraryUtils.java
lines 48 to 82, the platform detection constants like WIN_X64 use names such as
"win_x64" which differ from the folder names "windows_x64" in readme.md. To fix
this, update the readme.md file to replace all occurrences of "windows_x64",
"windows_x86", and "windows_arm64" with "win_x64", "win_x86", and "win_arm64"
respectively, ensuring consistency between the code constants and documentation
folder names.

@litongjava
Contributor Author

litongjava commented Aug 7, 2025

1. The library-loading logic in my library is not exactly the same as onnxruntime's.
2. This project only depends on onnxruntime's dynamic library files; it does not depend on the onnxruntime jar.
3. The library-loading logic is not complicated. I have tested my code on Windows, macOS, and Linux.

@csukuangfj
Collaborator

There is no need to depend on the onnxruntime jar.

I am only saying that the OS- and arch-detection code should be copied directly from onnxruntime's code rather than written from scratch.

@csukuangfj
Collaborator

Screenshot 2025-08-07 at 12 05 24

This logic already exists; I can't think of a reason not to use it.

If you insist on writing your own, have you tested it on every platform?

@litongjava
Contributor Author

Screenshot 2025-08-07 at 12 05 24 This logic already exists; I can't think of a reason not to use it.

If you insist on writing your own, have you tested it on every platform?

1. It can be used.
2. The main reason I did not use it is that the names are inconsistent; the names in my code follow the names of the archives you compiled.
3. Changed to use initOsArch and renamed the directories.
4. I tested my previous code on Windows 64, Linux 64, and macOS ARM 32.

@litongjava
Contributor Author

The changes are done; please review.


@coderabbitai coderabbitai bot left a comment


Actionable comments posted: 1

♻️ Duplicate comments (1)
sherpa-onnx/java-api/src/main/java/com/k2fsa/sherpa/onnx/utils/LibraryUtils.java (1)

12-27: Platform constants are well-defined

The platform identifiers follow a consistent naming convention and cover all major supported platforms and architectures comprehensively.

Note: There was a previous concern about naming inconsistency between these constants (e.g., WIN_X64 = "win_x64") and directory structure shown in documentation ("windows_x64"). Ensure consistency between code constants and documentation.
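To make the consistency requirement concrete, here is a short illustration of how the chosen identifier flows into the resource path the loader extracts from the JAR (the identifier value is assumed from this discussion, not read from the final code):

  // Illustration only: the constant's value must match the folder name under
  // src/main/resources/native/, otherwise extraction cannot find the library.
  String archName = "linux_arm64";                                 // e.g. the assumed value of LINUX_ARM64
  String libFileName = System.mapLibraryName("sherpa-onnx-jni");   // libsherpa-onnx-jni.so on Linux
  String resourcePath = "/native/" + archName + "/" + libFileName; // -> /native/linux_arm64/libsherpa-onnx-jni.so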

📜 Review details

Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro

📥 Commits

Reviewing files that changed from the base of the PR and between be6142b and fcf7387.

📒 Files selected for processing (1)
  • sherpa-onnx/java-api/src/main/java/com/k2fsa/sherpa/onnx/utils/LibraryUtils.java (1 hunks)
🧰 Additional context used
🧠 Learnings (4)
📓 Common learnings
Learnt from: litongjava
PR: k2-fsa/sherpa-onnx#2440
File: sherpa-onnx/java-api/src/main/java/com/k2fsa/sherpa/onnx/core/Core.java:4-6
Timestamp: 2025-08-06T04:18:47.981Z
Learning: In sherpa-onnx Java API, the native library names in Core.java (WIN_NATIVE_LIBRARY_NAME = "sherpa-onnx-jni.dll", UNIX_NATIVE_LIBRARY_NAME = "libsherpa-onnx-jni.so", MACOS_NATIVE_LIBRARY_NAME = "libsherpa-onnx-jni.dylib") are copied directly from the compiled binary filenames and should not be changed to match other libraries' naming conventions.
Learnt from: litongjava
PR: k2-fsa/sherpa-onnx#2440
File: sherpa-onnx/java-api/src/main/java/com/k2fsa/sherpa/onnx/core/Core.java:4-6
Timestamp: 2025-08-06T04:23:50.237Z
Learning: The sherpa-onnx JNI library files are stored in Hugging Face repository at https://huggingface.co/csukuangfj/sherpa-onnx-libs under versioned directories like jni/1.12.7/, and the actual Windows JNI library filename is "sherpa-onnx-jni.dll" as defined in Core.java constants.
📚 Learning: in sherpa-onnx java api, the native library names in core.java (win_native_library_name = "sherpa-on...
Learnt from: litongjava
PR: k2-fsa/sherpa-onnx#2440
File: sherpa-onnx/java-api/src/main/java/com/k2fsa/sherpa/onnx/core/Core.java:4-6
Timestamp: 2025-08-06T04:18:47.981Z
Learning: In sherpa-onnx Java API, the native library names in Core.java (WIN_NATIVE_LIBRARY_NAME = "sherpa-onnx-jni.dll", UNIX_NATIVE_LIBRARY_NAME = "libsherpa-onnx-jni.so", MACOS_NATIVE_LIBRARY_NAME = "libsherpa-onnx-jni.dylib") are copied directly from the compiled binary filenames and should not be changed to match other libraries' naming conventions.

Applied to files:

  • sherpa-onnx/java-api/src/main/java/com/k2fsa/sherpa/onnx/utils/LibraryUtils.java
📚 Learning: the sherpa-onnx jni library files are stored in hugging face repository at https://huggingface.co/cs...
Learnt from: litongjava
PR: k2-fsa/sherpa-onnx#2440
File: sherpa-onnx/java-api/src/main/java/com/k2fsa/sherpa/onnx/core/Core.java:4-6
Timestamp: 2025-08-06T04:23:50.237Z
Learning: The sherpa-onnx JNI library files are stored in Hugging Face repository at https://huggingface.co/csukuangfj/sherpa-onnx-libs under versioned directories like jni/1.12.7/, and the actual Windows JNI library filename is "sherpa-onnx-jni.dll" as defined in Core.java constants.

Applied to files:

  • sherpa-onnx/java-api/src/main/java/com/k2fsa/sherpa/onnx/utils/LibraryUtils.java
📚 Learning: windows 10 version 1809 and later, as well as windows 11, include built-in onnx runtime as part of w...
Learnt from: litongjava
PR: k2-fsa/sherpa-onnx#2440
File: sherpa-onnx/java-api/readme.md:35-36
Timestamp: 2025-08-06T04:18:28.680Z
Learning: Windows 10 version 1809 and later, as well as Windows 11, include built-in ONNX Runtime as part of Windows ML (WinML) exposed through Windows.AI.MachineLearning.dll and WinRT APIs. However, JNI libraries typically depend on standalone ONNX Runtime C/C++ libraries rather than the Windows ML API.

Applied to files:

  • sherpa-onnx/java-api/src/main/java/com/k2fsa/sherpa/onnx/utils/LibraryUtils.java
🔇 Additional comments (4)
sherpa-onnx/java-api/src/main/java/com/k2fsa/sherpa/onnx/utils/LibraryUtils.java (4)

1-11: LGTM - Clean imports and package structure

The package declaration follows Java conventions and imports are well-organized with appropriate use of the Core interface for constants.


28-47: Excellent native library loading strategy

The implementation follows best practices by:

  • Using System.mapLibraryName() for platform-appropriate naming
  • Implementing a clear fallback strategy (external directory → JAR extraction)
  • Using absolute paths with System.load() for reliable loading
  • Including helpful debugging output

77-96: Library initialization logic is sound

The method correctly:

  • Retrieves system properties (confirmed by previous discussion that these cannot be null)
  • Creates necessary directory structure with proper error handling
  • Constructs appropriate resource paths for extraction

Note: This method will work correctly once the separator inconsistency in initOsArch() is resolved.


98-116: Excellent resource extraction implementation

The method demonstrates best practices:

  • Proper resource management with try-with-resources
  • Null check for missing resources with descriptive error messages
  • Robust file copying with StandardCopyOption.REPLACE_EXISTING
  • Comprehensive exception handling

The Javadoc example correctly shows underscore separator format, confirming the expected resource path structure.

Comment on lines 49 to 75
/* Computes and initializes OS_ARCH_STR (such as linux-x64) */
private static String initOsArch() {
  String detectedOS = null;
  String os = System.getProperty("os.name", "generic").toLowerCase(Locale.ENGLISH);
  if (os.contains("mac") || os.contains("darwin")) {
    detectedOS = "osx";
  } else if (os.contains("win")) {
    detectedOS = "win";
  } else if (os.contains("nux")) {
    detectedOS = "linux";
  } else {
    throw new IllegalStateException("Unsupported os:" + os);
  }
  String detectedArch = null;
  String arch = System.getProperty("os.arch", "generic").toLowerCase(Locale.ENGLISH);
  if (arch.startsWith("amd64") || arch.startsWith("x86_64")) {
    detectedArch = "x64";
  } else if (arch.startsWith("x86")) {
    // 32-bit x86 is not supported by the Java API
    detectedArch = "x86";
  } else if (arch.startsWith("aarch64")) {
    detectedArch = "aarch64";
  } else {
    throw new IllegalStateException("Unsupported arch:" + arch);
  }
  return detectedOS + '-' + detectedArch;
}

@coderabbitai coderabbitai bot Aug 7, 2025


⚠️ Potential issue

Fix naming inconsistencies in platform detection

The method has inconsistencies that will cause runtime failures:

  1. Separator mismatch: Returns "linux-x64" (dash) but constants use "linux_x64" (underscore)
  2. Architecture naming mismatch: Maps "aarch64" → "aarch64" but constants use "arm64"

Apply this diff to fix the inconsistencies:

-    return detectedOS + '-' + detectedArch;
+    // Convert architecture names to match constants
+    if ("aarch64".equals(detectedArch)) {
+      detectedArch = "arm64";
+    }
+    return detectedOS + '_' + detectedArch;
🤖 Prompt for AI Agents
In
sherpa-onnx/java-api/src/main/java/com/k2fsa/sherpa/onnx/utils/LibraryUtils.java
between lines 49 and 75, fix the platform detection method by changing the
separator from dash to underscore in the returned string to match the constants,
and rename the architecture mapping from "aarch64" to "arm64" to align with the
constant naming. This will ensure consistent naming and prevent runtime
failures.

Contributor Author


I've changed the folder name.


🧩 Analysis chain

I'd like to verify the changes you made to ensure the naming inconsistencies are resolved.


🏁 Script executed:

#!/bin/bash
# Check the current implementation of LibraryUtils.java to see the changes
cat sherpa-onnx/java-api/src/main/java/com/k2fsa/sherpa/onnx/utils/LibraryUtils.java

Length of output: 4528


@litongjava Thank you for the update! However, I can see that the core issues in the initOsArch() method still exist. The method on Line 74 still returns strings like "linux-aarch64" (with dash separator) while your constants use underscores like "linux_arm64".

The inconsistencies that will cause runtime failures are still present:

  1. Line 74: return detectedOS + '-' + detectedArch; uses dash but should use underscore
  2. Line 66: detectedArch = "aarch64"; should be "arm64" to match your constants

This means the method will return "linux-aarch64" but try to load from "/native/linux-aarch64/" while your resources are likely organized as "/native/linux_arm64/".

Could you please apply the fix to the initOsArch() method:

// Convert architecture names to match constants
if ("aarch64".equals(detectedArch)) {
  detectedArch = "arm64";
}
return detectedOS + '_' + detectedArch;

This will ensure the platform detection matches your folder structure and constants.
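One optional way to make that mapping easy to check on every platform (a suggestion, not code from this PR) is to keep the string handling in a pure function that can be exercised without touching real system properties; a sketch with hypothetical names:

  // Hypothetical, testable variant of the detection logic; mirrors the fix above.
  static String osArchId(String osName, String osArch) {
    String lowerOs = osName.toLowerCase(java.util.Locale.ENGLISH);
    String os;
    if (lowerOs.contains("mac") || lowerOs.contains("darwin")) {
      os = "osx";
    } else if (lowerOs.contains("win")) {
      os = "win";
    } else if (lowerOs.contains("nux")) {
      os = "linux";
    } else {
      throw new IllegalStateException("Unsupported os: " + osName);
    }

    String lowerArch = osArch.toLowerCase(java.util.Locale.ENGLISH);
    String arch;
    if (lowerArch.startsWith("amd64") || lowerArch.startsWith("x86_64")) {
      arch = "x64";
    } else if (lowerArch.startsWith("aarch64")) {
      arch = "arm64";                       // renamed to match the constants
    } else if (lowerArch.startsWith("x86")) {
      arch = "x86";
    } else {
      throw new IllegalStateException("Unsupported arch: " + osArch);
    }
    return os + '_' + arch;                 // underscore to match the folder layout
  }

  // e.g. osArchId("Linux", "aarch64")    -> "linux_arm64"
  //      osArchId("Windows 11", "amd64") -> "win_x64"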


🧠 Learnings used
Learnt from: litongjava
PR: k2-fsa/sherpa-onnx#2440
File: sherpa-onnx/java-api/src/main/java/com/k2fsa/sherpa/onnx/core/Core.java:4-6
Timestamp: 2025-08-06T04:18:47.981Z
Learning: In sherpa-onnx Java API, the native library names in Core.java (WIN_NATIVE_LIBRARY_NAME = "sherpa-onnx-jni.dll", UNIX_NATIVE_LIBRARY_NAME = "libsherpa-onnx-jni.so", MACOS_NATIVE_LIBRARY_NAME = "libsherpa-onnx-jni.dylib") are copied directly from the compiled binary filenames and should not be changed to match other libraries' naming conventions.

Learnt from: litongjava
PR: k2-fsa/sherpa-onnx#2440
File: sherpa-onnx/java-api/src/main/java/com/k2fsa/sherpa/onnx/core/Core.java:4-6
Timestamp: 2025-08-06T04:23:50.237Z
Learning: The sherpa-onnx JNI library files are stored in Hugging Face repository at https://huggingface.co/csukuangfj/sherpa-onnx-libs under versioned directories like jni/1.12.7/, and the actual Windows JNI library filename is "sherpa-onnx-jni.dll" as defined in Core.java constants.

Learnt from: litongjava
PR: k2-fsa/sherpa-onnx#2440
File: sherpa-onnx/java-api/readme.md:35-36
Timestamp: 2025-08-06T04:18:28.680Z
Learning: Windows 10 version 1809 and later, as well as Windows 11, include built-in ONNX Runtime as part of Windows ML (WinML) exposed through Windows.AI.MachineLearning.dll and WinRT APIs. However, JNI libraries typically depend on standalone ONNX Runtime C/C++ libraries rather than the Windows ML API.
