
SYSTEM AND METHOD FOR MAPPING FEATURES OF A WAREHOUSE ENVIRONMENT HAVING IMPROVED WORKFLOW


Title:
SYSTEM AND METHOD FOR MAPPING FEATURES OF A WAREHOUSE ENVIRONMENT HAVING IMPROVED WORKFLOW
Link:
Published: 2024
Media type: Patent
Additional details:
  • Indexed in: USPTO Patent Applications
  • Languages: English
  • Document Number: 20240160223
  • Publication Date: May 16, 2024
  • Appl. No: 18/510136
  • Application Filed: November 15, 2023
  • Claim: 1. A mapping system comprising: one or more processors; a first robot, in communication with at least one of said one or more processors, and including one or more data capture sensors; wherein said one or more processors are configured to support a mapping mode wherein said first robot is configured to be navigated through an environment to collect geospatial data using said one or more data capture sensors; wherein said one or more processors are configured to execute a Frontend N block for reading and processing said geospatial data from said one or more data capture sensors of said robot, and are configured to store said data in a keyframe object at a keyframe database; and wherein said one or more processors are configured to execute a Backend block capable of at least one selected from the group of detecting loop constraints, building submaps, and optimizing a pose graph using keyframe data from one or more trajectory blocks.
  • Claim: 2. The system of claim 1, wherein said one or more processors are further configured to support a localization mode wherein a map generated from said data is employed to regulate movement of a number of different robots.
  • Claim: 3. The system of claim 2, further comprising a second robot, wherein said second robot is an AMR configured to receive said map for permitting said second robot to navigate said environment.
  • Claim: 4. The system of claim 1, wherein at least one sensor of said one or more data capture sensors comprises a camera.
  • Claim: 5. The system of claim 1, wherein said first robot is an AMR.
  • Claim: 6. The system of claim 1, wherein said Frontend block is further configured to estimate visual odometry and build a keyframe.
  • Claim: 7. The system of claim 3, wherein said environment is a warehouse.
  • Claim: 8. The system of claim 2, wherein said first robot and said one or more processors are capable of permitting said map to be generated from said data in real time.
  • Claim: 9. The system of claim 1, wherein said Frontend N block comprises a block for at least one selected from the group of estimating visual odometry, extracting geometric features, and matching extracted features from a left image with a right image.
  • Claim: 10. The system of claim 9, wherein said block is configured to match extracted features from a left image with a right image using a patch-based method.
  • Claim: 11. The system of claim 1, wherein said Frontend N block comprises a neural network-based feature extractor.
  • Claim: 12. The system of claim 1, wherein said Frontend N block is configured to employ triangulation to estimate feature depth.
  • Claim: 13. The system of claim 1, wherein said Backend block is configured to perform relative pose estimation.
  • Claim: 14. The system of claim 13, wherein said Backend block is configured to perform relative pose estimation by executing a random sample consensus algorithm with a perspective n-point model technique.
  • Claim: 15. The system of claim 2, further comprising a trimmer block configured to ensure that in localization mode, memory usage remains constant.
  • Claim: 16. A mapping system comprising: one or more processors; a first robot, in communication with at least one of said one or more processors, and including one or more data capture sensors; wherein said one or more processors are configured to support a mapping mode wherein said first robot is configured to be navigated through a warehouse environment to collect geospatial data using said one or more data capture sensors; wherein said one or more processors are configured to execute a Frontend N block for reading and processing said geospatial data from said one or more data capture sensors of said robot, and are configured to store said data in a keyframe object at a keyframe database; wherein said one or more processors are configured to execute a Backend block capable of at least one selected from the group of detecting loop constraints, building submaps, and optimizing a pose graph using keyframe data from one or more trajectory blocks; wherein said one or more processors are further configured to support a localization mode wherein a map generated from said data is employed to regulate movement of a number of different robots; wherein at least one sensor of said one or more data capture sensors comprises a camera; and wherein said Frontend block is further configured to estimate visual odometry and build a keyframe.
  • Claim: 17. A mapping method comprising: providing one or more processors; providing a first robot, said first robot in communication with at least one of said one or more processors, and including one or more data capture sensors; configuring said one or more processors to support a mapping mode wherein said first robot is navigated through an environment to collect geospatial data using said one or more data capture sensors; configuring said one or more processors to execute a Frontend N block for reading and processing said geospatial data from said one or more data capture sensors of said robot, and configuring said one or more processors to store said data in a keyframe object at a keyframe database; and configuring said one or more processors to execute a Backend block capable of at least one selected from the group of detecting loop constraints, building submaps, and optimizing a pose graph using keyframe data from one or more trajectory blocks.
  • Claim: 18. The method of claim 16, further comprising configuring said one or more processors to support a localization mode wherein a map generated from said data is employed to regulate movement of a number of different robots.
  • Claim: 19. The method of claim 16, further comprising configuring said Frontend block to estimate visual odometry and build a keyframe.
  • Claim: 20. The method of claim 16, wherein said environment is a warehouse.
  • Current International Class: 05; 05; 05; 05; 05; 05; 05; 06
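Claims 1 and 15 describe a keyframe database fed by the Frontend block, plus a trimmer block that keeps memory usage constant in localization mode. The application discloses no code; the sketch below is illustrative only, and the class name, field names, and size-cap eviction policy are assumptions standing in for the claimed trimmer behaviour.

```python
from collections import OrderedDict

class KeyframeDatabase:
    """Toy keyframe store. A size cap plays the role of the claimed
    trimmer block: once the cap is reached, the oldest keyframe is
    evicted, so memory stays constant in localization mode."""

    def __init__(self, max_keyframes=None):
        self.max_keyframes = max_keyframes  # None = unbounded (mapping mode)
        self._frames = OrderedDict()

    def add(self, frame_id, keyframe):
        self._frames[frame_id] = keyframe
        if self.max_keyframes is not None and len(self._frames) > self.max_keyframes:
            self._frames.popitem(last=False)  # drop the oldest keyframe

    def __len__(self):
        return len(self._frames)

# Localization mode: bounded store, ten keyframes added, three retained.
db = KeyframeDatabase(max_keyframes=3)
for i in range(10):
    db.add(i, {"pose": (float(i), 0.0), "features": []})
```

Whether the claimed trimmer evicts by age, by spatial redundancy, or by some other criterion is not stated in the claims; FIFO eviction is simply the shortest way to show constant memory.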
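Claim 10 recites matching extracted features from a left image with a right image using a patch-based method. A common patch-based scheme (not necessarily the one practised here) slides a window along the rectified scan line and picks the offset minimising the sum of absolute differences; the sketch below uses 1-D rows for brevity, where a real system would compare 2-D windows. All names are hypothetical.

```python
def sad(a, b):
    """Sum of absolute differences between two equal-length patches."""
    return sum(abs(x - y) for x, y in zip(a, b))

def match_feature(left_row, right_row, x_left, half=1, max_disp=8):
    """Return the disparity whose right-image patch best matches the
    left-image patch centred at x_left (minimum SAD)."""
    patch = left_row[x_left - half : x_left + half + 1]
    best_d, best_cost = 0, float("inf")
    for d in range(max_disp + 1):
        x_r = x_left - d
        if x_r - half < 0:
            break  # candidate patch would run off the image edge
        cand = right_row[x_r - half : x_r + half + 1]
        cost = sad(patch, cand)
        if cost < best_cost:
            best_d, best_cost = d, cost
    return best_d

# A bright 3-pixel feature at x=10 in the left row appears 3 px further
# left in the right row, so the recovered disparity should be 3.
left_row = [0] * 20
left_row[9:12] = [5, 9, 5]
right_row = [0] * 20
right_row[6:9] = [5, 9, 5]
disparity = match_feature(left_row, right_row, 10)
```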
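Claim 12 states that the Frontend N block employs triangulation to estimate feature depth. For a rectified stereo pair the standard relation is Z = f·B/d, where f is the focal length in pixels, B the camera baseline, and d the disparity between the matched left and right features. The function below is an illustrative sketch of that relation, not code from the application.

```python
def triangulate_depth(x_left, x_right, focal_px, baseline_m):
    """Estimate depth (metres) of a feature matched across a rectified
    stereo pair.

    x_left / x_right : horizontal pixel coordinate in each image
    focal_px         : focal length in pixels
    baseline_m       : distance between the two cameras in metres
    """
    disparity = x_left - x_right
    if disparity <= 0:
        raise ValueError("matched feature must lie further left in the right image")
    return focal_px * baseline_m / disparity

# 20 px of disparity with a 700 px focal length and a 0.12 m baseline:
# depth = 700 * 0.12 / 20 = 4.2 m
depth = triangulate_depth(320, 300, 700, 0.12)
```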
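Claim 14 specifies relative pose estimation by a random sample consensus algorithm with a perspective n-point model. A full PnP solver is beyond a short sketch, so the generic RANSAC loop below is demonstrated with a 2-D line as a stand-in model; in the claimed system the `fit` step would instead be a PnP solve over 2D-3D correspondences. All function names here are illustrative assumptions.

```python
import random

def ransac(data, fit, error, sample_size, threshold, iterations=200, seed=0):
    """Generic random sample consensus: repeatedly fit a model to a
    minimal random sample and keep the model with the most inliers."""
    rng = random.Random(seed)
    best_model, best_inliers = None, []
    for _ in range(iterations):
        sample = rng.sample(data, sample_size)
        model = fit(sample)
        if model is None:  # degenerate sample, e.g. duplicated x
            continue
        inliers = [d for d in data if error(model, d) < threshold]
        if len(inliers) > len(best_inliers):
            best_model, best_inliers = model, inliers
    return best_model, best_inliers

# Stand-in model: y = a*x + b fitted exactly from two sample points.
def fit_line(pts):
    (x1, y1), (x2, y2) = pts
    if x1 == x2:
        return None
    a = (y2 - y1) / (x2 - x1)
    return a, y1 - a * x1

def line_error(model, pt):
    a, b = model
    x, y = pt
    return abs(y - (a * x + b))

# Ten points on y = 2x + 1 plus two gross outliers; RANSAC recovers the
# line and rejects the outliers.
points = [(float(x), 2.0 * x + 1.0) for x in range(10)] + [(3.0, 40.0), (7.0, -5.0)]
model, inliers = ransac(points, fit_line, line_error, sample_size=2, threshold=0.5)
```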
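Claims 1, 16, and 17 recite a Backend block that detects loop constraints and optimizes a pose graph from keyframe data. The minimal sketch below shows the idea on 1-D poses: odometry edges chain consecutive poses, a loop-closure edge ties the last pose back to the first, and gradient descent on the squared residuals spreads the accumulated drift around the loop. The application does not disclose an optimizer; this formulation and its parameters are assumptions.

```python
def optimize_pose_graph(n_poses, edges, iterations=500, step=0.1):
    """Minimise sum of squared residuals (p_j - p_i - z)^2 over 1-D poses.

    edges: (i, j, z) relative-measurement constraints, including
    loop-closure edges. Pose 0 is held fixed as the anchor.
    """
    poses = [0.0] * n_poses
    for _ in range(iterations):
        grad = [0.0] * n_poses
        for i, j, z in edges:
            r = poses[j] - poses[i] - z  # residual of this constraint
            grad[j] += r
            grad[i] -= r
        for k in range(1, n_poses):  # pose 0 stays anchored
            poses[k] -= step * grad[k]
    return poses

# Odometry reports three +1.0 steps, but the loop-closure edge says the
# final pose sits only 2.7 from the start; least squares spreads the
# 0.3 of drift equally over the four edges (optimum: 0.925, 1.85, 2.775).
edges = [(0, 1, 1.0), (1, 2, 1.0), (2, 3, 1.0), (3, 0, -2.7)]
poses = optimize_pose_graph(4, edges)
```

Real backends (2-D/3-D poses with rotations, robust kernels, sparse solvers) are far richer, but the drift-spreading behaviour of a loop constraint is the same.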
