Deploying a CodePen Project Locally


This article walks through the steps needed to deploy a MediaPipe FaceLandmarker project from CodePen to a local environment. By consolidating the HTML, CSS, and JavaScript into a single file and correcting the external resource references, it resolves the problems that come up when running the project locally, keeps the project fully functional, and gives developers a practical guide to debugging and refining CodePen projects on their own machines.

In front-end development, online code editors such as CodePen make rapid prototyping and sharing very convenient. However, migrating such a project to a local environment for deeper development, debugging, or deployment can be challenging, especially when the project depends on a specific build pipeline or on external resources. Using a MediaPipe FaceLandmarker project as an example, this article explains in detail how to get a CodePen project running locally.

Analysis of the Core Problems

Copying a CodePen project directly to the local file system usually does not work, for the following main reasons:

  1. CSS preprocessor dependencies: CodePen supports CSS preprocessors such as Sass and Less. Syntax like `@use "@material";` requires a build step to compile it into standard CSS; copied to a local file as-is, the browser cannot interpret it.
  2. JavaScript module resolution: CodePen may support ES module imports through its own internal machinery, but when a local HTML file uses `import … from "…"` directly, the import path must be resolvable, which usually means a full CDN URL or a local file path.
  3. Resource paths: Many CodePen projects rely on CDN-hosted libraries, fonts, images, or machine-learning model files. When running locally, these paths must be absolute, reachable URLs.
  4. File structure and linking: CodePen keeps the HTML, CSS, and JS in separate panels. To run locally, they must be combined correctly into one HTML file, with the CSS and JS properly linked or inlined.

Steps for Local Deployment

To run a CodePen project locally, the code needs the following adjustments:

1. Consolidate the Code Structure

The simplest and most direct approach is to merge the HTML, CSS, and JavaScript into a single index.html file.

  • Place the CSS inside a <style> tag in the <head> section.
  • Place the JavaScript inside a <script type="module"> tag just before the closing </body> tag.
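The consolidated layout can be sketched as follows (structure only; the actual styles and logic from the CodePen panels go where the comments indicate):

```html
<!DOCTYPE html>
<html>
<head>
  <meta charset="utf-8">
  <title>Face Landmarker</title>
  <!-- External CSS loaded via CDN <link> tags goes here -->
  <style>
    /* Contents of the CodePen CSS panel go here */
  </style>
</head>
<body>
  <!-- Contents of the CodePen HTML panel go here -->
  <script type="module">
    // Contents of the CodePen JS panel go here,
    // with import statements rewritten to full CDN URLs
  </script>
</body>
</html>
```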

2. Handle CSS Preprocessors and External Styles

The CSS in the original CodePen project may contain Sass syntax such as `@use "@material";`. Without a Sass compiler in the local environment, the browser cannot parse it. The solution is:


  • Remove the preprocessor syntax: delete lines such as `@use "@material";`.
  • Include a compiled CSS build of the library: if the project depends on the Material Design component library, load its compiled CSS via a CDN link, for example by adding the following to the <head>:
    <link href="https://unpkg.com/material-components-web@latest/dist/material-components-web.min.css" rel="stylesheet">

3. Fix the JavaScript Module Imports

Module imports in CodePen's JavaScript may depend on its environment. To run ES modules locally, every import path must be a valid URL.

  • Use full CDN paths: replace import statements such as `import vision from "@mediapipe/tasks-vision";` with a full CDN URL. For example:
    import vision from "https://cdn.jsdelivr.net/npm/@mediapipe/tasks-vision@0.10.0";

    Note that the `[email protected]` in the original answer is the result of e-mail obfuscation; it should actually read `tasks-vision@0.10.0`.
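If several bare specifiers need rewriting, the rule can be captured in a small helper. This is a hypothetical sketch: the `PINNED_VERSIONS` map and the function name are assumptions for this project, not part of any library.

```javascript
// Hypothetical helper: rewrite a bare module specifier to a pinned jsDelivr URL.
// The package-to-version map below is an assumption for this project; extend as needed.
const CDN_BASE = "https://cdn.jsdelivr.net/npm/";
const PINNED_VERSIONS = {
  "@mediapipe/tasks-vision": "0.10.0",
};

function toCdnUrl(specifier) {
  const version = PINNED_VERSIONS[specifier];
  if (!version) {
    throw new Error(`No pinned version for "${specifier}"`);
  }
  // Pinning an exact version avoids surprise breakage when the package updates.
  return `${CDN_BASE}${specifier}@${version}`;
}

console.log(toCdnUrl("@mediapipe/tasks-vision"));
// -> https://cdn.jsdelivr.net/npm/@mediapipe/tasks-vision@0.10.0
```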

  • Fix the WASM file path: libraries such as MediaPipe need to load WebAssembly (WASM) files. The FilesetResolver.forVisionTasks() method expects the path of the WASM file directory; make sure it points to the correct CDN path:
    const filesetResolver = await FilesetResolver.forVisionTasks(
      "https://cdn.jsdelivr.net/npm/@mediapipe/tasks-vision@0.10.0/wasm"
    );

4. Make Sure All External Resources Are Reachable

Check every external resource reference in the HTML, CSS, and JavaScript, such as image src attributes and model files (modelAssetPath), and confirm that they all use full, publicly accessible URLs.
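A quick way to audit this is to flag any reference that is not an absolute http(s) URL, since relative paths that resolved inside the CodePen editor will 404 locally. A minimal sketch (the sample `refs` array is illustrative):

```javascript
// Flag resource references that are not absolute http(s) URLs.
function isAbsoluteUrl(ref) {
  try {
    const url = new URL(ref);
    return url.protocol === "http:" || url.protocol === "https:";
  } catch {
    return false; // relative paths and bare specifiers fail to parse
  }
}

// Illustrative sample; in practice, collect these from src/href/modelAssetPath values.
const refs = [
  "https://storage.googleapis.com/mediapipe-assets/portrait.jpg",
  "./assets/portrait.jpg", // would break when run locally
];
for (const ref of refs) {
  console.log(`${isAbsoluteUrl(ref) ? "OK " : "FIX"} ${ref}`);
}
```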

Complete Example Code

With the fixes above applied, the following MediaPipe FaceLandmarker code can be run directly in a local environment:

<html>
<head>
  <meta charset="utf-8">
  <meta http-equiv="Cache-control" content="no-cache, no-store, must-revalidate">
  <meta http-equiv="Pragma" content="no-cache">
  <meta name="viewport" content="width=device-width, initial-scale=1, user-scalable=no">
  <title>Face Landmarker</title>

  <style>
    /* @use "@material"; removed; the compiled CSS is loaded via the <link> below */
    body {
      font-family: helvetica, arial, sans-serif;
      margin: 2em;
      color: #3d3d3d;
      --mdc-theme-primary: #007f8b;
      --mdc-theme-on-primary: #f1f3f4;
    }

    h1 {
      font-style: italic;
      color: #007f8b;
    }

    h2 {
      clear: both;
    }

    em {
      font-weight: bold;
    }

    video {
      clear: both;
      display: block;
      transform: rotateY(180deg);
      -webkit-transform: rotateY(180deg);
      -moz-transform: rotateY(180deg);
    }

    section {
      opacity: 1;
      transition: opacity 500ms ease-in-out;
    }

    header,
    footer {
      clear: both;
    }

    .removed {
      display: none;
    }

    .invisible {
      opacity: 0.2;
    }

    .note {
      font-style: italic;
      font-size: 130%;
    }

    .videoView,
    .detectOnClick,
    .blend-shapes {
      position: relative;
      float: left;
      width: 48%;
      margin: 2% 1%;
      cursor: pointer;
    }

    .videoView p,
    .detectOnClick p {
      position: absolute;
      padding: 5px;
      background-color: #007f8b;
      color: #fff;
      border: 1px dashed rgba(255, 255, 255, 0.7);
      z-index: 2;
      font-size: 12px;
      margin: 0;
    }

    .highlighter {
      background: rgba(0, 255, 0, 0.25);
      border: 1px dashed #fff;
      z-index: 1;
      position: absolute;
    }

    .canvas {
      z-index: 1;
      position: absolute;
      pointer-events: none;
    }

    .output_canvas {
      transform: rotateY(180deg);
      -webkit-transform: rotateY(180deg);
      -moz-transform: rotateY(180deg);
    }

    .detectOnClick {
      z-index: 0;
    }

    .detectOnClick img {
      width: 100%;
    }

    .blend-shapes-item {
      display: flex;
      align-items: center;
      height: 20px;
    }

    .blend-shapes-label {
      display: flex;
      width: 120px;
      justify-content: flex-end;
      align-items: center;
      margin-right: 4px;
    }

    .blend-shapes-value {
      display: flex;
      height: 16px;
      align-items: center;
      background-color: #007f8b;
    }
  </style>

  <link href="https://unpkg.com/material-components-web@latest/dist/material-components-web.min.css" rel="stylesheet">
  <script src="https://unpkg.com/material-components-web@latest/dist/material-components-web.min.js"></script>
</head>
<body>
  <h1>Face landmark detection using the MediaPipe FaceLandmarker task</h1>

  <section id="demos" class="invisible">
    <h2>Demo: Detecting Images</h2>
    <p><b>Click on an image below</b> to see the key landmarks of the face.</p>

    <div class="detectOnClick">
      <img src="https://storage.googleapis.com/mediapipe-assets/portrait.jpg" width="100%" crossorigin="anonymous" title="Click to get detection!" />
    </div>
    <div class="blend-shapes">
      <ul class="blend-shapes-list" id="image-blend-shapes"></ul>
    </div>

    <h2>Demo: Webcam continuous face landmarks detection</h2>
    <p>Hold your face in front of your webcam to get real-time face landmarker detection.<br>Click <b>enable webcam</b> below and grant access to the webcam if prompted.</p>

    <div id="liveView" class="videoView">
      <button id="webcamButton" class="mdc-button mdc-button--raised">
        <span class="mdc-button__ripple"></span>
        <span class="mdc-button__label">ENABLE WEBCAM</span>
      </button>
      <div style="position: relative;">
        <video id="webcam" style="position: abso" autoplay playsinline></video>
        <canvas class="output_canvas" id="output_canvas" style="position: absolute; left: 0px; top: 0px;"></canvas>
      </div>
    </div>
    <div class="blend-shapes">
      <ul class="blend-shapes-list" id="video-blend-shapes"></ul>
    </div>
  </section>

  <script type="module">
    import vision from "https://cdn.jsdelivr.net/npm/@mediapipe/tasks-vision@0.10.0";

    const { FaceLandmarker, FilesetResolver, DrawingUtils } = vision;
    const demosSection = document.getElementById("demos");
    const imageBlendShapes = document.getElementById("image-blend-shapes");
    const videoBlendShapes = document.getElementById("video-blend-shapes");

    let faceLandmarker;
    let runningMode = "IMAGE"; // TypeScript type annotations removed for plain JS
    let enableWebcamButton;
    let webcamRunning = false;
    const videoWidth = 480;

    async function runDemo() {
      const filesetResolver = await FilesetResolver.forVisionTasks(
        "https://cdn.jsdelivr.net/npm/@mediapipe/tasks-vision@0.10.0/wasm"
      );
      faceLandmarker = await FaceLandmarker.createFromOptions(filesetResolver, {
        baseOptions: {
          modelAssetPath: `https://storage.googleapis.com/mediapipe-models/face_landmarker/face_landmarker/float16/1/face_landmarker.task`,
          delegate: "GPU"
        },
        outputFaceBlendshapes: true,
        runningMode,
        numFaces: 1
      });
      demosSection.classList.remove("invisible");
    }
    runDemo();

    const imageContainers = document.getElementsByClassName("detectOnClick");

    for (let i = 0; i < imageContainers.length; i++) {
      imageContainers[i].children[0].addEventListener("click", handleClick);
    }

    async function handleClick(event) {
      if (!faceLandmarker) {
        console.log("Wait for faceLandmarker to load before clicking!");
        return;
      }

      if (runningMode === "VIDEO") {
        runningMode = "IMAGE";
        await faceLandmarker.setOptions({ runningMode });
      }

      // Remove any canvases left over from a previous detection.
      const allCanvas = event.target.parentNode.getElementsByClassName("canvas");
      for (var i = allCanvas.length - 1; i >= 0; i--) {
        const n = allCanvas[i];
        n.parentNode.removeChild(n);
      }

      const faceLandmarkerResult = faceLandmarker.detect(event.target);
      const canvas = document.createElement("canvas");
      canvas.setAttribute("class", "canvas");
      canvas.setAttribute("width", event.target.naturalWidth + "px");
      canvas.setAttribute("height", event.target.naturalHeight + "px");
      canvas.style.left = "0px";
      canvas.style.top = "0px";
      canvas.style.width = `${event.target.width}px`;
      canvas.style.height = `${event.target.height}px`;

      event.target.parentNode.appendChild(canvas);
      const ctx = canvas.getContext("2d");
      const drawingUtils = new DrawingUtils(ctx);
      for (const landmarks of faceLandmarkerResult.faceLandmarks) {
        drawingUtils.drawConnectors(
          landmarks,
          FaceLandmarker.FACE_LANDMARKS_TESSELATION,
          { color: "#C0C0C070", lineWidth: 1 }
        );
        drawingUtils.drawConnectors(
          landmarks,
          FaceLandmarker.FACE_LANDMARKS_RIGHT_EYE,
          { color: "#FF3030" }
        );
        drawingUtils.drawConnectors(
          landmarks,
          FaceLandmarker.FACE_LANDMARKS_RIGHT_EYEBROW,
          { color: "#FF3030" }
        );
        drawingUtils.drawConnectors(
          landmarks,
          FaceLandmarker.FACE_LANDMARKS_LEFT_EYE,
          { color: "#30FF30" }
        );
        drawingUtils.drawConnectors(
          landmarks,
          FaceLandmarker.FACE_LANDMARKS_LEFT_EYEBROW,
          { color: "#30FF30" }
        );
        drawingUtils.drawConnectors(
          landmarks,
          FaceLandmarker.FACE_LANDMARKS_FACE_OVAL,
          { color: "#E0E0E0" }
        );
        drawingUtils.drawConnectors(landmarks, FaceLandmarker.FACE_LANDMARKS_LIPS, {
          color: "#E0E0E0"
        });
        drawingUtils.drawConnectors(
          landmarks,
          FaceLandmarker.FACE_LANDMARKS_RIGHT_IRIS,
          { color: "#FF3030" }
        );
        drawingUtils.drawConnectors(
          landmarks,
          FaceLandmarker.FACE_LANDMARKS_LEFT_IRIS,
          { color: "#30FF30" }
        );
      }
      drawBlendShapes(imageBlendShapes, faceLandmarkerResult.faceBlendshapes);
    }

    const video = document.getElementById("webcam");
    const canvasElement = document.getElementById("output_canvas");
    const canvasCtx = canvasElement.getContext("2d");

    function hasGetUserMedia() {
      return !!(navigator.mediaDevices && navigator.mediaDevices.getUserMedia);
    }

    if (hasGetUserMedia()) {
      enableWebcamButton = document.getElementById("webcamButton");
      enableWebcamButton.addEventListener("click", enableCam);
    } else {
      console.warn("getUserMedia() is not supported by your browser");
    }

    function enableCam(event) {
      if (!faceLandmarker) {
        console.log("Wait! faceLandmarker not loaded yet.");
        return;
      }

      if (webcamRunning === true) {
        webcamRunning = false;
        enableWebcamButton.innerText = "ENABLE PREDICTIONS";
      } else {
        webcamRunning = true;
        enableWebcamButton.innerText = "DISABLE PREDICTIONS";
      }

      const constraints = {
        video: true
      };

      navigator.mediaDevices
