update wwws and documentation

2025-08-01 09:36:49 +02:00
parent 63381d564d
commit 05a87c93fc
8 changed files with 133 additions and 69 deletions

GEMINI.md (new file, 80 lines)
View File

@@ -0,0 +1,80 @@
## Instructions for Gemini on the APXTRi project
This file contains the rules and conventions specific to this project.
- **API Documentation:** Every new API route, and every change to an existing route, must be systematically documented using the `apiDoc` format.
- **Language Files:** For every new user-visible text string (for example, error or success messages), the corresponding translation keys must be added to the `models/tplstrings/*_fr.json` and `models/tplstrings/*_en.json` files.
- **Visual Validation:** Before using a tool that modifies a file (`replace` or `write_file`), print the exact code to be changed (both the old and the new version) to the console, and wait for my visual confirmation before applying the modification.
- **Project architecture:** A `.env` file declares where the Express project lives (`NODEPATH`); data lives under `DATAPATH` and logs under `LOGPATH`. Caddy is used as a reverse proxy for Express and as the static web server. apXtri, whose entry point is `apxtri.js`, is organized as a RESTful (CRUD) API around object management, driven by `routes/odmdb.js` and `models/Odmdb.js`. Each object relies on a `schema/<objectname>.json` that describes its properties (JSON Schema draft 2020-12), declares a unique identifier `apxid`, and defines access rights (`accessrights`) organized by profile, with C (create), R (read), U (update), and D (delete) rights granted according to whether the requester (identified via a header) belongs to a given profile. The schema also lists indexes used to sort, list, and classify identifiers. apXtri is itself a tribe that lets other tribes operate, in a shared way, objects private to each tribe. Objects are stored by tribe name, then by object name: in an `itm` directory, each object is saved as a JSON file named after the value of its `apxid`; each index is stored as `<name>.json` in an `idx` directory at the same level as `itm`. When object management needs to bypass rights or trigger specific actions, a tribe lives in the `NODEPATH` directory under its unique name and follows the same routes/models layout as apxtri; its data is organized the same way, in a directory named after the tribe, and its routes are activated when `apxtri.js` starts. To avoid feeding you too much data, Gemini is launched on apxtowns/apxtri; a `tmp/data/tribename/objects/wco` and `wwws` directory lets you analyze and modify those html, js, and css files.
- **Specifics of the wco and wwws objects:** Static files served by Caddy are stored in the wwws object; `wwws/itm/appname.json` holds the webapp's configuration. The working files of a web app are stored in `wwws/appname/src`, and the optimized files, produced under the control of the model `apxtri/models/Wwws.js`, are stored in `wwws/appname/dist`. Caddy exposes `/src/` and `/dist/` on the web.

### Core Philosophy: Dynamic, Config-Driven UIs
The apXtri frontend architecture is not built like a traditional, static Single Page Application (SPA). Instead, it's a dynamic, configuration-driven system. The server assembles the entire context for a web page—including data, templates, and component logic—based on a series of JSON configuration files.
This approach allows for:
- **Extreme Reusability:** wco components are self-contained and can be dropped into any wwws application.
- **Dynamic Page Composition:** The layout and content of a page can be altered without deploying new code, simply by changing the JSON configuration.
- **Data-Centric Design:** The system is centered around well-defined data models (schema) and local data instances (tpldata).
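To make the configuration-driven idea concrete, here is a hypothetical sketch of the kind of schema file an object could declare. The `apxid`, `accessrights`, and index list mirror the concepts described above, but every property name and value below is illustrative, not the real apXtri schema format:

```javascript
// Hypothetical schema/<objectname>.json, shown as a JS object literal.
const personSchema = {
  $schema: "https://json-schema.org/draft/2020-12/schema",
  title: "person",
  apxid: "alias", // property whose value uniquely identifies an item
  type: "object",
  properties: {
    alias: { type: "string" },
    email: { type: "string" }
  },
  required: ["alias"],
  // Per-profile CRUD rights (assumed shape): the requester's profile,
  // resolved from headers, must grant the letter matching the operation.
  accessrights: {
    anonymous: "R",
    owner: "CRUD"
  },
  // Indexes are written to idx/<name>.json next to the itm/ directory.
  apxidx: [{ name: "lst_alias", keyval: "alias" }]
};

// Storage layout implied above: <tribe>/<object>/itm/<apxid-value>.json
const itemPath = (tribe, obj, id) => `${tribe}/${obj}/itm/${id}.json`;
console.log(itemPath("apxtri", "persons", "alice"));
// → apxtri/persons/itm/alice.json
```

The point is not the exact keys but the pattern: one JSON file fully describes an object's shape, identity, rights, and indexes.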
### The Building Blocks: Key Files and Directories
#### wwws (Web Application)
- **Definition (`/tmp/data/objects/wwws/itm/{appname}/{appname}.json`):** This is the manifest for a web application. Its most important property is `pages`, which defines the structure and component makeup of each page in the app.
- **Source Files (`/tmp/data/objects/wwws/itm/{appname}/src/`):** This directory contains the application's "host" pages (like `apxid_fr.html`), which are simple HTML shells. It also contains the local data instances (tpldata) that customize wco components for this specific app.
#### wco (Web Component Object)
- **Definition (`/tmp/data/objects/wco/itm/{wconame}.json`):** The manifest for a reusable component. It declares the component's templates (`tpl`), data models (`tpldatamodel`), and other metadata.
- **Business Logic (`/tmp/data/objects/wco/itm/{wconame}/{wconame}.js`):** A server-side JavaScript file that can contain logic associated with the component.
- **Base Logic (`/tmp/data/objects/wco/itm/apx/apx.js`):** A foundational JavaScript library that provides common functionality (data fetching, rendering, event handling) to all wco components, acting as a mini-framework.
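A wco manifest might look like the following sketch. The key names `tpl` and `tpldatamodel` come from the description above; everything else, including the file-name helper, is an assumption for illustration:

```javascript
// Hypothetical wco manifest (wco/itm/<wconame>.json) as a JS object.
const apxauthManifest = {
  name: "apxauth",
  tpl: ["apxauth_fr.mustache", "apxauth_en.mustache"], // component templates
  tpldatamodel: ["apxauth"], // data models copied into each host app
  description: "authentication component"
};

// When installed, each data model becomes an app-local file named
// src/tpldata/<pagename>_<tagid>_<wconame>.json (matching the
// admin/src/tpldata/apxid_the-tag-id_apxauth.json example in Step 1).
const localTpldataName = (pagename, tagid, wconame) =>
  `${pagename}_${tagid}_${wconame}.json`;
console.log(localTpldataName("apxid", "the-tag-id", "apxauth"));
// → apxid_the-tag-id_apxauth.json
```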
#### Models and Routes
- **`models/Wwws.js`:** The server-side Node.js model responsible for interpreting the wwws and wco configurations. It is the brain of the operation.
- **`routes/wwws.js`:** The Express.js router that exposes the functionality of `Wwws.js` as a set of API endpoints.
### The End-to-End Workflow: From Request to Render
Let's trace the journey of a user requesting the apxid_fr.html page from the admin application.
#### Step 1: The "Installation" of a Component (A Developer Task)
This is a one-time setup action performed by a developer.
- **Developer Action:** A developer decides to add the apxauth component to the admin app's apxid page.
- **API Call:** They trigger a call to the `/getwco/apxauth` endpoint, providing the tribe, `xapp=admin`, and `pagename=apxid` in the query.
- **Wwws.getwco Execution:**
  - The `getwco` function in `models/Wwws.js` is executed.
  - It reads the component's manifest (`apxauth.json`).
  - It reads the application's manifest (`admin.json`).
  - It merges the component's configuration (its templates, schemas, etc.) into the `pages.apxid` section of `admin.json`.
  - Crucially, it looks at the `tpldatamodel` defined in `apxauth.json`. For each data model, it creates a corresponding local data file in the application's directory (e.g., `admin/src/tpldata/apxid_the-tag-id_apxauth.json`). This file is a copy of the component's template data, ready for app-specific customization.
- **Result:** The `admin.json` file is now updated. The apxauth component is officially "installed" on the apxid page.
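A minimal sketch of this installation step, using plain in-memory objects instead of the real files (the actual `Wwws.getwco` reads and writes JSON on disk, and the manifest shapes below are assumptions):

```javascript
// Toy version of the getwco "installation": merge a component's config
// into one page of the app manifest and create a local tpldata copy.
function installWco(appManifest, wco, pagename, tagid) {
  const page = appManifest.pages[pagename];
  // Merge the component's templates into the page configuration.
  Object.assign(page.tpl, wco.tpl);
  // Copy the component's template data so the app can customize it
  // (the real code writes src/tpldata/<page>_<tagid>_<wconame>.json).
  page.tpldata[`${pagename}_${tagid}_${wco.name}`] =
    JSON.parse(JSON.stringify(wco.tpldatamodel));
  return appManifest;
}

const adminManifest = { pages: { apxid: { tpl: {}, tpldata: {} } } };
const apxauth = {
  name: "apxauth",
  tpl: { apxauth: "apxauth.mustache" },
  tpldatamodel: { title: "Sign in" }
};
installWco(adminManifest, apxauth, "apxid", "the-tag-id");
console.log(Object.keys(adminManifest.pages.apxid.tpldata));
```

The deep copy matters: the app-local tpldata must be editable without mutating the component's shipped defaults.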
#### Step 2: The User Page Load (A User Action)
- **Browser Request:** The user navigates to https://admin.apxtri.farm.ants/src/apxid_fr.html. The browser loads this simple HTML file.
- **Initial JavaScript Execution:** The `apxid_fr.html` file contains `<script>` tags. The core script, `apx.js`, is loaded. This script acts as the bootstrapper.
- **Bootstrapper Action:** The `apx.js` script immediately makes an API call to fetch its configuration and data. It calls one of the `updatelocaldb` endpoints in `routes/wwws.js` (e.g., `/updatelocaldb/the-tribe/admin/apxid/0`).
- **Wwws.initlocaldata Execution:**
  - The `initlocaldata` function in `models/Wwws.js` runs.
  - It reads the now-updated `admin.json` to understand the structure of the apxid page.
  - It sees that this page requires templates, data, and schemas from the apxauth component (and potentially others).
  - It reads all the required files from disk in parallel:
    - Mustache templates (`.mustache`)
    - customized local data (`admin/src/tpldata/*.json`)
    - schema definitions (`schema/*.json`)
    - other referenced data (itms, options, etc.)
  - It bundles all of this content into a single, large JSON object.
- **Data Payload Returned:** The server responds to the `updatelocaldb` call with this large JSON object, which we'll call `localData`.
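The bootstrap call can be sketched as follows. The route shape is taken from the example URL `/updatelocaldb/the-tribe/admin/apxid/0` above; the header names in the comment are assumptions based on the required-headers middleware:

```javascript
// URL the apx.js bootstrapper would call on page load.
const updatelocaldbUrl = (tribe, xapp, pagename, version) =>
  `/updatelocaldb/${tribe}/${xapp}/${pagename}/${version}`;

// In the browser, the call would look roughly like:
//   const res = await fetch(updatelocaldbUrl("the-tribe", "admin", "apxid", 0), {
//     headers: { xtribe: "the-tribe", xlang: "fr" } // assumed header names
//   });
//   const localData = await res.json(); // tpl, tpldata, schema, itms, options...
console.log(updatelocaldbUrl("the-tribe", "admin", "apxid", 0));
// → /updatelocaldb/the-tribe/admin/apxid/0
```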
- **Frontend Rendering:**
  - The `apx.js` script receives the `localData` object. It now has everything it needs to render the page:
    - `localData.tpl` contains the Mustache template strings.
    - `localData.tpldata` contains the data to populate the templates.
    - `localData.schema` provides validation rules.
  - `apx.js` uses a Mustache rendering library to combine the templates and data, generating HTML.
  - This newly generated HTML is then injected into the DOM of the `apxid_fr.html` shell page, bringing the application to life.
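As a rough stand-in for these last two steps (the real code uses a full Mustache library; this toy renderer only handles flat `{{key}}` interpolation, no sections or partials):

```javascript
// Minimal {{key}} interpolation, standing in for Mustache.render().
function renderTiny(template, data) {
  return template.replace(/\{\{(\w+)\}\}/g, (match, key) =>
    data[key] !== undefined ? String(data[key]) : match);
}

const tpl = "<h1>{{title}}</h1><p>{{msg}}</p>";
const html = renderTiny(tpl, { title: "apxid", msg: "Welcome" });
// In the browser, apx.js would then inject the result into the shell page,
// e.g. something like: document.getElementById("the-tag-id").innerHTML = html;
console.log(html); // → <h1>apxid</h1><p>Welcome</p>
```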

View File

@@ -19,7 +19,7 @@ class ApxtriApp {
this.dataPath = `${process.env.DATAPATH}/data`;
this.nodePath = `${process.env.NODEPATH}/${process.env.TOWN}-${process.env.NATION}`;
}
/**
* Starts the application server.
*/
@@ -28,9 +28,10 @@ class ApxtriApp {
this._performPreflightChecks();
const conflist = this._discoverConfiguration();
await this._setupSharedSymlinks(conflist); // Now runs after and receives the config
await this._rebuildIndexes(conflist);
if (process.env.MODE==="dev") {
await this._setupSharedSymlinks(conflist); // Now runs after and receives the config
}
await this._rebuildIndexes(conflist);
const app = this._setupExpress(conflist);
await this._configureCaddy(conflist);
@@ -209,6 +210,8 @@ class ApxtriApp {
}
/**
* Useful for vibe coding in dev with Gemini, which has limited file access;
* it is not possible to give Gemini access to all the data (it would use too much memory context).
* Ensures that `tmp/data` contains a mirrored directory for each tribe,
* with symlinks to the 'wco' and 'wwws' object directories inside an 'objects' folder.
* This version is idempotent and does not clean the directory, making it faster.

View File

@@ -5,24 +5,14 @@
const fs = require("fs-extra");
const path = require("path");
const logger = require('../utils/logger');
// --- Constants ---
const DATA_PATH = `${process.env.DATAPATH}/data`;
const LOG_CURRENT_MODULE = process.env.ACTIVELOG.split(',').includes('checkHeaders');
const REQUIRED_HEADERS = process.env.EXPOSEDHEADERS.split(',');
// --- Helper Functions ---
/**
* Logs messages to the console if logging is enabled for this module.
* @param {...any} args - The messages to log.
*/
const log = (...args) => {
if (LOG_CURRENT_MODULE) {
console.log("[checkHeaders]", ...args);
}
};
/**
* Collects required headers from the request object.
* @param {object} req - The Express request object.
@@ -37,7 +27,7 @@ const collectHeaders = (req) => {
req.headers.xlang = req.header("Content-Language");
}
log("Incoming headers:", req.headers);
logger.debug("Incoming headers:", req.headers);
for (const h of REQUIRED_HEADERS) {
const value = req.header(h);
@@ -48,7 +38,7 @@ const collectHeaders = (req) => {
}
}
if (req.headers.xhash) header.xhash=req.headers.xhash;
log("Collected session headers:", header);
logger.debug("Collected session headers:", header);
return { header, missingHeaders };
};
@@ -64,7 +54,7 @@ const isTribeIdValid = (header, appLocals) => {
const allowedTribes = ["town", "apxtri", ...appLocals.tribeids];
const isValid = allowedTribes.includes(xtribe);
if (!isValid) {
log(`Validation failed: Tribe ID '${xtribe}' not found in allowed list.`);
logger.warn(`Validation failed: Tribe ID '${xtribe}' not found in allowed list.`);
}
return isValid;
};
@@ -79,11 +69,11 @@ const normalizeLanguage = async (header) => {
try {
const config = await fs.readJson(confPath);
if (!config.api.languages.includes(header.xlang)) {
log(`Language '${header.xlang}' not supported, defaulting to 'en'.`);
logger.info(`Language '${header.xlang}' not supported, defaulting to 'en'.`);
header.xlang = "en";
}
} catch (error) {
console.error("[checkHeaders] Error reading tribe configuration:", error);
logger.error("Error reading tribe configuration:", error);
// Default to 'en' if config is unavailable
header.xlang = "en";
}
@@ -100,7 +90,7 @@ const checkHeaders = async (req, res, next) => {
req.session.header = header;
if (missingHeaders.length > 0) {
log("Request failed: Missing required headers.", missingHeaders);
logger.warn("Request failed: Missing required headers.", { missing: missingHeaders });
return res.status(400).json({
ref: "middlewares",
msg: "missing_headers",
@@ -120,9 +110,8 @@ const checkHeaders = async (req, res, next) => {
// Set default profiles; authenticated routes will overwrite this.
header.xprofils = ["anonymous"];
log("Header check passed for tribe:", header.xtribe);
logger.info("Header check passed for tribe:", header.xtribe);
next();
};
module.exports = checkHeaders;
module.exports = checkHeaders;

View File

@@ -7,26 +7,16 @@ const fs = require("fs-extra");
const dayjs = require("dayjs");
const path = require("path");
const openpgp = require("openpgp");
const logger = require('../utils/logger');
// --- Constants ---
const DATA_PATH = `${process.env.DATAPATH}/data`;
const TOKENS_DIR = path.join(DATA_PATH, "apxtri", "tmp", "tokens");
const TMP_DIR = path.join(DATA_PATH, "apxtri", "tmp");
const TOKEN_EXPIRATION_MS = 24 * 60 * 60 * 1000; // 24 hours
const LOG_CURRENT_MODULE = process.env.ACTIVELOG.split(',').includes('isAuthenticated');
// --- Helper Functions ---
/**
* Logs messages to the console if logging is enabled for this module.
* @param {...any} args - The messages to log.
*/
const log = (...args) => {
if (LOG_CURRENT_MODULE) {
console.log("[isAuthenticated]", ...args);
}
};
/**
* Cleans up expired token files and other temporary files once per day.
* This is triggered on the first authenticated request of a new day.
@@ -37,7 +27,7 @@ const cleanupExpiredTokens = async () => {
return;
}
log("Running daily cleanup of expired tokens and temporary files...");
logger.info("Running daily cleanup of expired tokens and temporary files...");
// Remove old markers
const oldMarkers = (await fs.readdir(TOKENS_DIR)).filter(f => f.startsWith('menagedone_'));
@@ -52,7 +42,7 @@ const cleanupExpiredTokens = async () => {
const timestamp = parseInt(parts[2], 10);
if (timestamp && (dayjs().valueOf() - timestamp > TOKEN_EXPIRATION_MS)) {
await fs.remove(path.join(TOKENS_DIR, file));
log(`Removed expired token: ${file}`);
logger.info(`Removed expired token: ${file}`);
}
}
@@ -89,10 +79,10 @@ const verifySignature = async (header, publicKeyArmored) => {
try {
const { xalias, xdays, xhash } = header;
const expectedMessage = `${xalias}_${xdays}`;
console.log("ffffffff",xhash)
const cleartextMessage = Buffer.from(xhash, "base64").toString();
if (!cleartextMessage.startsWith("-----BEGIN PGP SIGNED MESSAGE-----")) {
log("xhash is not a valid PGP signed message.");
logger.warn("xhash is not a valid PGP signed message.");
return false;
}
@@ -105,7 +95,7 @@ const verifySignature = async (header, publicKeyArmored) => {
});
if (!signatures || signatures.length === 0) {
log("Signature verification failed: No signatures found.");
logger.warn("Signature verification failed: No signatures found.");
return false;
}
@@ -116,12 +106,12 @@ const verifySignature = async (header, publicKeyArmored) => {
const isDataIntact = receivedText.trim() === expectedMessage;
if (!isDataIntact) {
log(`Signature data mismatch. Expected: "${expectedMessage}", Got: "${receivedText.trim()}"`);
logger.warn(`Signature data mismatch. Expected: "${expectedMessage}", Got: "${receivedText.trim()}"`);
}
return isDataIntact;
} catch (error) {
log("Signature verification failed:", error.message);
logger.error("Signature verification failed:", error.message);
return false;
}
};
@@ -144,7 +134,7 @@ const getUserProfiles = async (header) => {
profiles = [...new Set([...profiles, ...personInfo.profils])];
}
}
log(`Profiles for ${xalias}:`, profiles);
logger.debug(`Profiles for ${xalias}:`, profiles);
return profiles;
};
@@ -157,11 +147,11 @@ const isAuthenticated = async (req, res, next) => {
await cleanupExpiredTokens();
const { header } = req.session;
log("Authenticating request for alias:", header);
logger.debug("Authenticating request for alias:", header.xalias);
// 1. Handle anonymous user immediately
if (header.xalias === "anonymous" || !header.xhash || header.xhash === "anonymous") {
log("Anonymous user detected or no hash provided. No authentication required.");
logger.debug("Anonymous user detected or no hash provided. No authentication required.");
header.xprofils = ['anonymous'];
return next();
}
@@ -170,10 +160,10 @@ const isAuthenticated = async (req, res, next) => {
let hashToUse = (req.cookies && req.cookies.xauthhash) ? req.cookies.xauthhash : null;
if (!hashToUse || hashToUse === 'anonymous') {
log("No valid hash in cookie, falling back to 'xhash' header.");
logger.debug("No valid hash in cookie, falling back to 'xhash' header.");
hashToUse = header.xhash;
} else {
log("Using hash from 'xauthhash' cookie.");
logger.debug("Using hash from 'xauthhash' cookie.");
}
// Synchronize the header with the definitive hash for consistent downstream logic
@@ -181,7 +171,7 @@ const isAuthenticated = async (req, res, next) => {
// If no hash could be found in either place, authentication is not possible.
if (!hashToUse || hashToUse === 'anonymous') {
log("No authentication hash found in header or cookie. Access denied.");
logger.warn("No authentication hash found in header or cookie. Access denied.");
return res.status(401).json({ ref: "middlewares", msg: "authentication_hash_missing" });
}
@@ -190,12 +180,12 @@ const isAuthenticated = async (req, res, next) => {
// 3. Fast Path: Check for an existing, valid token file
if (await fs.pathExists(tokenPath)) {
header.xprofils = await fs.readJson(tokenPath);
log(`Authenticated via existing token file for ${header.xalias}.`);
logger.info(`Authenticated via existing token file for ${header.xalias}.`);
return next();
}
// 4. Slow Path: Verify PGP signature
log("No valid token file found. Proceeding with PGP signature verification.");
logger.info("No valid token file found. Proceeding with PGP signature verification.");
const paganFilePath = path.join(DATA_PATH, "apxtri", "objects", "pagans", "itm", `${header.xalias}.json`);
if (!(await fs.pathExists(paganFilePath))) {
@@ -215,7 +205,7 @@ const isAuthenticated = async (req, res, next) => {
}
// 4. Signature is valid: Grant access, create token, and set cookie
log("Signature verified successfully.");
logger.info("Signature verified successfully.");
header.xprofils = await getUserProfiles(header);
await fs.outputJson(tokenPath, header.xprofils);
@@ -228,8 +218,8 @@ const isAuthenticated = async (req, res, next) => {
};
res.cookie('xauthhash', header.xhash, cookieOptions);
log(`Authentication successful for ${header.xalias}. Token created and cookie set.`);
logger.info(`Authentication successful for ${header.xalias}. Token created and cookie set.`);
next();
};
module.exports = isAuthenticated;
module.exports = isAuthenticated;

View File

@@ -51,7 +51,7 @@ class Caddy {
headers: { 'Content-Type': 'application/json' }
});
log("✅ Caddy configuration updated successfully.");
this._updatePermissions();
//this._updatePermissions();
} catch (error) {
console.error("❌ Caddy reload error:", error.response?.data || error.message);
}
@@ -156,14 +156,14 @@ class Caddy {
* Updates file permissions to ensure Caddy can serve the files.
* @private
*/
static _updatePermissions() {
const cmd = `find "${DATA_PATH}" -type d -path "*/objects/wwws" -exec chmod o+rx {} +`;
/* static _updatePermissions() {
const cmd = `find "${DATA_PATH}" -type d -path " * /objects/wwws" -exec chmod o+rx {} +`;
exec(cmd, (error, stdout, stderr) => {
if (error) console.error(`❌ Exec error updating permissions: ${error.message}`);
if (stderr) console.error(`Stderr updating permissions: ${stderr}`);
log("Caddy file permissions updated.");
});
}
} */
}
module.exports = Caddy;

View File

@@ -400,13 +400,15 @@ class Odmdb {
return { status: 404, ref: "Odmdb", msg: "index_name_not_found_in_schema" };
}
// Check for access rights.
const accessrights = indexConfig.accessrights || [];
const userProfils = role.xprofils || [];
const hasPermission = userProfils.some(profil => accessrights.includes(profil));
// Check for access rights. If accessrights is not defined or is an empty array, the index is public.
const accessrights = indexConfig.accessrights;
if (accessrights && accessrights.length > 0) {
const userProfils = role.xprofils || [];
const hasPermission = userProfils.some(profil => accessrights.includes(profil));
if (!hasPermission) {
return { status: 403, ref: "Odmdb", msg: "permission_denied_for_index" };
if (!hasPermission) {
return { status: 403, ref: "Odmdb", msg: "permission_denied_for_index" };
}
}
// If permission is granted, read and return the index file.

View File

@@ -101,10 +101,12 @@ class Wwws {
const headers = (appConf.apxtri && appConf.apxtri.headers) ? appConf.apxtri.headers : {};
const localData = {
version: pageConf.version,
town:process.env.TOWN,
nation:process.env.NATION,
headers: { ...headers, xlang: lg },
confpage: pageConf.confpage,
itm: {}, itms: {}, options: {}, tpl: {}, tpldata: {},
ref: {}, schema: {}, screens: {},wcodata:pageConf.wcodata,appdata:pageConf.appdata
ref: {}, schema: {}, screens: {},wco:pageConf.wco,appdata:pageConf.appdata
};
const promises = [];

View File

@@ -404,18 +404,16 @@ router.post("/search/:tribe/:objectname", checkHeaders, isAuthenticated, async (
* @apiError (404 Not Found) IndexNotFound The requested index does not exist or is not defined in the schema.
* @apiError (500 Internal Server Error) InternalServerError An unexpected error occurred on the server.
*/
router.get("/idx/:objectname/:idxname", checkHeaders, isAuthenticated, async (req, res) => {
router.get("/idx/:tribe/:objectname/:idxname", checkHeaders, isAuthenticated, async (req, res) => {
try {
const { objectname, idxname } = req.params;
// For this specific route, the tribe is hardcoded to 'apxtri' as per the request.
const tribe = 'apxtri';
const { tribe, objectname, idxname } = req.params;
const db = new Odmdb(objectname, tribe);
const role = req.session.header; // The role is populated by isAuthenticated middleware
const result = await db.getidx(idxname, role);
res.status(result.status).json(result);
} catch (error) {
handleError(res, error, `GET /idx/${req.params.objectname}/${req.params.idxname}`);
handleError(res, error, `GET /idx/${req.params.tribe}/${req.params.objectname}/${req.params.idxname}`);
}
});