forked from apxtri/apxtri
This commit is contained in:
philc 2024-04-12 12:49:48 +02:00
parent 4c959380b6
commit 52fcf0d9aa
9 changed files with 221 additions and 187 deletions

View File

@ -18,14 +18,10 @@ See [apxtri web site](https://apxtri.crabdance.com) how to create a new social w
All actors share the same goal: respecting contracts, and they are free to leave or stay in a nation, a town or a tribe. If a contract is not fair, the nation or tribe will empty out, meaning it creates no value and dies. Only fair rules will survive, but every attempt rewards actors with XTRIB coin.
```plaintext
/town-nation/
/town-nation/ tribe space
/conf.json town conf express.js
/tmp/tokens temporary files
/tribes/ tribe's space
/idx/
/itm/admin.json tribe confs are stored
tribeName.json
/adminapi/ Allow apxtribe management
/adminapi/ Space to manage tribes
/apxtri/ core system that manages the tribe adminapi referential, endpoint https://.../api/routeName/...
/middlewares
/models
@ -35,15 +31,22 @@ All actors will have the same target to respect contracts and are free to leave
/nginx/adminapi_adminapx.conf nginx conf per website
/schema/ list of schemas as ObjectName.json; title, description and comments are in english
/lg list of schemas ObjectName_lg.json per language (same structure as ObjectName.json but in language lg)
/objects/objectName/idx/ list of indexName.json
/objects/objectName/
/idx/ list of indexName.json
/itm/ list of object contents stored as apxid.json (apxid is the unique key identifying an item in a collection of items)
/pagans Unique numeric ID shared across all nodes (towns)
/towns Unique town name shared across all nodes by domain name + IP
/
/www/cdn/ web public access file
/share/apxtriVx.tar latest apxTri version for installation
/conf.json Version list and schema link that define this object
some key objects
/pagans/ Unique numeric ID shared across all nodes (towns)
/towns/ Unique town name shared across all nodes by domain name + IP
/...
/wwws/ Webspace that can be served with nginx
cdn/ web public access file
/adminapx/index_lg.html administration webapp
/website/
/idx/
/itm/adminapi.json tribe conf admin
tribeName.json tribeName conf
/tribeName/
/apxtri/routes Specific web service https://.../api/tribename/routeName/endpoint
/middlewares

View File

@ -2,54 +2,20 @@
API documentation for routes and middlewares has to respect apidoc's rules [https://apidocjs.com/](https://apidocjs.com) 
To update this doc accessible in [https://wal-ants.ndda.fr/cdn/apidoc](https://wal-ants.ndda.fr/cdn/apidoc) :
To update this doc accessible in [https://wal-ants.ndda.fr/apidoc](https://wal-ants.ndda.fr/cdn/apidoc) :
`yarn apidoc` 
 `$ tribe=adminapi yarn apidoc` 
For a tribe's api doc accessible in [https://smatchit.io/cdn/apidoc](https://smatchit.io/cdn/apidoc): 
For a tribe's api doc accessible in [https://admin.smatchit.io/apidoc](https://smatchit.io/cdn/apidoc): 
`yarn apidoctribename`
`$ tribe=smatchit yarn apidoc`
A special tribe called adminapi is replicated in every town (node); it works the same as all the other tribes except that all its data is synchronized with a blockchain 
To get an overview check the project README.md and the package.json [https://gitea.ndda.fr/apxtri/apxtri](https://gitea.ndda.fr/apxtri/apxtri)
A special tribe called adminapi exists in every town (node); it works the same as all the other tribes except that all its data is synchronized with a blockchain 
Objects managed by adminapi are: pagans (numeric id = alias / public key / private key), notifications (ciphered messages between aliases), nations (rules applying to all towns belonging to a nation), towns (a server that hosts IT resources: disk space, ram, bandwidth, and rules applying to all tribes belonging to a town), tribes (a sharing space to store data as well as an api with rules for any person that uses it), wwws (web space, dns)
All other objects are managed by specific tribes. 
```plaintext
/townName_nationName/
/conf/nginx/tribename_appname.conf # nginx conf
/conf/apidoc # apidoc conf
/conf/townconf.json # town settings containing all global parameters
/tribes/idx/triebid_all.json # A global file {tribename:{conf}
/itm/tribename.json # Config file of a tribe
/adminapi # Tribes synchronize with all town
/apxtri # git yarn/npm project package.json entry point apxtri.js
/routes/
/models/
/middlewares/
/logs/nginx # nginx log related to /conf/nginx/apxtri_adminapi.conf
/api
/objects/objectname/idx/ # list of index to search objectname items
/itms/ # 1 json per item named apxid.json where apxid is a unique key
/wwws/idx/
/itm/
appname.json # website appname conf
cdn.json
/appname/ # website files
/cdn/ # cached files to optimize nginx static file delivery
/schema/conf.json # list of schema and version
/objectname.json # schema; title and description are in english
/lg/objectname_lg.json # title and description in lg
/tribename/ # same as adminapi for a specific tribe,
# we only have 1 node process that manage 1 town that manage many tribes api
```
API Endpoint url: **/api/{tribename}/{routename}/xxx**
The domain name can be an adminapi domain name as well as any tribe's domain name. Check nginx conf in /conf/nginx 
## Object management (Odmdb)
An object has a name and is defined by a schema that contains property keys.
@ -57,34 +23,57 @@ An object has a name and is defined by a schema that contain properties key.
A property has a name and a list of characteristics (type, pattern, format, ...) that have to be validated for a value to be accepted.
All properties respect the rules [https://json-schema.org/draft/2020-12/schema,](https://json-schema.org/draft/2020-12/schema,) some extra "format" values can be added to share recurrent regex patterns
To access a schema [https://wall-ants.ndda.fr/nationchains/schema/nations.json](https://wall-ants.ndda.fr/nationchains/schema/nations.json) and language specifique [https//:wall-ants.ndda.fr/nationchains/schema/lg/nations\_fr.json](https//:wall-ants.ndda.fr/nationchains/schema/lg/nations_fr.json)
To access a schema: [https://wall-ants.ndda.fr/api/adminapi/schema/tribename/schemaname.json](https://wall-ants.ndda.fr/nationchains/schema/nations.json); the language is set by the xlang header
A Checkjson.js is available to manage all specific formats [https://wall-ants.ndda.fr/Checkjson.js](https://wall-ants.ndda.fr/Checkjson.js); see **Odmdb - schema Checkjson**
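A minimal usage sketch of such a format check, assuming Checkjson exposes the format regex map shown in the Checkjson.js diff further down (the require path and the call are illustrative, not the actual Checkjson API):
```js
// Hypothetical sketch: test a value against one of the extra "format" regexes.
const Checkjson = require("../models/Checkjson.js"); // assumed path

const value = "https://wall-ants.ndda.fr/Checkjson.js";
const isValidUrl = Checkjson.schema.properties.format.url.test(value);
console.log(isValidUrl); // true when the value matches the url pattern
```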
**Additional properties that do not exist in 2020-12/schema:**
**required**: an array of required properties
**Additional properties that do not exist in 2020-12/schema:**
**apxid**: the property used as a unique id
**apxuniquekey**: array of unique properties
**apxidx** : array of index
**apxidx** : array of index definitions
**apxaccessrights**: object keyed by profile name with access rights on properties {profilname:{C:\[properties array\],R:\[properties array\],U:\[\],D:\[\]}}
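For illustration only, a hypothetical persons schema could combine standard 2020-12 keywords with these apx* extensions (property names are taken from the accessrights example below; the apxidx entry shape is an assumption):
```plaintext
{
  "$schema": "https://json-schema.org/draft/2020-12/schema",
  "title": "persons",
  "type": "object",
  "properties": {
    "alias": { "type": "string" },
    "owner": { "type": "string" },
    "profils": { "type": "array", "items": { "type": "string" } },
    "firstname": { "type": "string" },
    "lastname": { "type": "string" },
    "dt_birth": { "type": "string", "format": "date" }
  },
  "required": ["alias", "profils"],
  "apxid": "alias",
  "apxuniquekey": ["alias"],
  "apxidx": [{ "name": "lst_alias", "keyval": "alias" }],
  "apxaccessrights": { "owner": { "R": ["alias", "firstname"], "U": ["firstname"] } }
}
```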
Items of an object are stored in files at:  
```plaintext
/objectnames/idx/keyval_objkey.json
/objectnames/itm/uniqueid.json
tribename/objectnames/idx/keyval_objkey.json
tribename/objectnames/itm/uniqueid.json
```
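A minimal sketch of reading an item and an index from this layout with fs-extra, assuming the process runs from the apxtri working directory so the ../../ relative prefix used elsewhere in the code applies; tribe, object and id names are illustrative:
```js
const fs = require("fs-extra");

// Read one item of an object by its apxid (here a person identified by alias "jdoe").
const itm = fs.readJsonSync("../../smatchit/objects/persons/itm/jdoe.json");

// Read an index file built from the apxidx definitions of the schema.
const idx = fs.readJsonSync("../../smatchit/objects/persons/idx/lst_alias.json");
console.log(itm, idx);
```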
## Accessrights:
An alias is just an identity; to access a tribe, a person with an authenticated alias must exist in /tribes/{tribename}/objects/persons/itm/{alias}.json
A person has a property profils with a list of profile names. Common profiles are: anonymous (no identity) / pagan (an identity) / person (an identity with access rights into a tribe) / druid (the administrator of a tribe) / mayor (administrator of a town/server) / and any profile can be defined for a tribe
Each object has an apxaccessrights that is a list of profiles and their CRUD access per object key.
Example: an owner of this object can create and delete an item it owns, can read a listed set of properties and can update only some of them.
```plaintext
"owner": {
"C" : [],
"D": [],
"R": ["alias","owner","profils","firstname","lastname","dt_birth"],
"U": ["firstname","lastname","dt_birth"]
}
```
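A minimal sketch of how such a rule could be evaluated (a hypothetical helper, not the actual Odmdb accessright logic):
```js
// Hypothetical helper: list the properties a person's profiles allow for a CRUD action.
// accessrights is the apxaccessrights object of a schema, profils the person's profile list.
const allowedProperties = (accessrights, profils, crud) => {
  const allowed = new Set();
  profils.forEach((profil) => {
    const rights = accessrights[profil];
    if (rights && rights[crud]) rights[crud].forEach((prop) => allowed.add(prop));
  });
  return [...allowed];
};

// With the owner example above, an update ("U") request is limited to:
// ["firstname", "lastname", "dt_birth"]
console.log(
  allowedProperties(
    { owner: { C: [], D: [], R: ["alias", "owner", "profils", "firstname", "lastname", "dt_birth"], U: ["firstname", "lastname", "dt_birth"] } },
    ["owner"],
    "U"
  )
);
```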
## api pre-request
API Endpoint url: **/api/{tribename}/{routename}/xxx**
The domain name can be an adminapi domain name as well as any tribe's domain name. Check nginx conf in /tribename/nginx 
**Valid header see Middlewares**
The app uses the openpgp.js lib to sign xdays\_xalias with a private key and stores it in xhash.
The app uses the openpgp.js lib to sign xalias\_xdays (xdays is a timestamp integer in milliseconds from Unix Epoch) with a private key and stores it in xhash.
/middlewares/isAuthenticated.js checks whether xhash is a valid signature for the alias public key; an xhash is valid for 24 hours
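A minimal client-side sketch of producing these headers, assuming the openpgp.js v5 API and an armored, passphrase-protected private key (the helper name and header handling are illustrative):
```js
const openpgp = require("openpgp");

// Hypothetical sketch: sign "xalias_xdays" and return the three auth headers.
const buildAuthHeaders = async (xalias, armoredPrivateKey, passphrase) => {
  const xdays = Date.now(); // timestamp in milliseconds from Unix Epoch
  const privateKey = await openpgp.decryptKey({
    privateKey: await openpgp.readPrivateKey({ armoredKey: armoredPrivateKey }),
    passphrase,
  });
  const message = await openpgp.createMessage({ text: `${xalias}_${xdays}` });
  // detached:true returns an armored signature string, stored in the xhash header
  const xhash = await openpgp.sign({ message, signingKeys: privateKey, detached: true });
  return { xalias, xdays, xhash };
};
```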
@ -108,17 +97,9 @@ C - a json multi answer **{status,multimsg:\[{ref,msg,data}\]}**
To show a feedback context message in a language lg => get /api/adminapi/objects/tplstrings/{{model}}\_{{lg}}.json
This contains a json {msg:"mustache template string to render with data"}  
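A minimal sketch of rendering such a message with the mustache npm package, assuming the tplstrings file maps a msg key to its template (the file path and key handling are illustrative):
```js
const fs = require("fs-extra");
const Mustache = require("mustache");

// Hypothetical sketch: render the feedback message for a model, language, msg key and data.
const renderFeedback = (model, lg, msg, data) => {
  const tpl = fs.readJsonSync(`../../adminapi/objects/tplstrings/${model}_${lg}.json`);
  return Mustache.render(tpl[msg], data);
};

// e.g. renderFeedback("Odmdb", "en", "itmfound", { itm: { alias: "jdoe" } });
```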
## Accessrights:
An alias is just an identity; to access a tribe, a person with an authenticated alias must exist in /tribes/{tribename}/objects/persons/itm/{alias}.json
A person has a property profils with a list of profile names. Common profiles are: anonymous (no identity) / pagan (an identity) / person (an identity with access rights into a tribe) / druid (the administrator of a tribe) / mayor (administrator of a town/server)
Each object has an apxaccessrights that is a list of profiles and their CRUD access per object key.
## Add tribe's api:
Accessible with https://dns/api/tribename/routes
Accessible with https://dns/api/tribename/routename/
```plaintext
/tribes/tribename/apxtri/routes

View File

@ -52,6 +52,8 @@ const isAuthenticated = async (req, res, next) => {
glob.sync(`../../tmp/tokens/*.json`).forEach((f) => {
const fsplit = f.split("_");
const elapse = tsday - parseInt(fsplit[2]);
console.log("##############################")
console.log(fsplit,"--",fsplit[2])
//24h 86400000 milliseconde 15mn 900000
if (elapse && elapse > 86400000) {
fs.remove(f);

View File

@ -94,6 +94,7 @@ Checkjson.schema.properties.format = {
"idn-email": / /,
uuid: /^[0-9a-f]{8}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{4}-[0-9a-f]{12}$/,
uri: / /,
url: /^(?:(?:https?|ftp):\/\/)(?:\w+(?::\w+)?@)?(?:(?:[a-z0-9-\.]+\.[a-z]{2,})(?:[-a-z0-9+\._\%\!\\[\]\(\)\,\*\?\&\=\:]*){1,})|(?:(?:25[0-5]|2[0-4][0-9]|[0-1]?[0-9][0-9]?)\.(?:25[0-5]|2[0-4][0-9]|[0-1]?[0-9][0-9]?)\.(?:25[0-5]|2[0-4][0-9]|[0-1]?[0-9][0-9]?)\.(?:25[0-5]|2[0-4][0-9]|[0-1]?[0-9][0-9]?))(?:[:\/#][^#]*)?$/,
"uri-reference": / /,
iri: / /,
hostname: / /,

View File

@ -32,21 +32,23 @@ Notifications.get = (alias, tribeId) => {
*/
Notifications.statmaillist=(tribe)=>{
const statinfo={}
let csv="email/phone;name;srckey\n"
const src=`../../${tribe}/objects/maillinglists/*.json`
console.log(path.resolve(src))
glob.sync(src).forEach(f=>{
const name=path.basename(f,".json");
const mlst=fs.readJSONSync(f)
Object.keys(mlst).forEach(c=>{
csv+=`"${c}";"${name}";"${mlst[c].srckeys.join('-')}"\n`
mlst[c].srckeys.forEach(s=>{
if (!statinfo[s]) statinfo[s]={}
if (!statinfo[s][name]) statinfo[s][name]=0
statinfo[s][name]++
})
//console.log(c) retrieve the contacts (tel or email)
})
})
// csv file stored locally until a back end exists to store the response in data.csv
fs.outputFileSync(`../../${tribe}/mailinglst.csv`,csv,"utf-8");
return {status:200,ref:"Notifications",msg:"statistics",data:statinfo}
}

View File

@ -156,53 +156,102 @@ Odmdb.updateObject = (objectPathname, meta) => {};
* @lg language you want to get schema
* @return {status:200,data:{conf:"schemaconf",schema:"schemacontent"} }
*/
Odmdb.Schema = (objectPathname, validschema, lg="en") => {
const replacelg = (data)=>{
Odmdb.Schema = (objectPathname, validschema, lg = "en") => {
const replacelg = (data) => {
// data.en is the base schema version, data.lg is the translated schema version
Object.keys(data.lg).forEach(k=>{
console.log(k)
if (data.lg[k].title) data.en[k].title = data.lg[k].title
if (data.lg[k].description) data.en[k].description = data.lg[k].description
if (data.lg.properties){
console.log('properties')
console.log(data.en.properties)
console.log(data.lg.properties)
const res = replacelg({en:data.en.properties,lg:data.lg.properties})
data.lg.properties=res.lg
data.en.properties=res.en
Object.keys(data.lg).forEach((k) => {
console.log(k);
if (data.lg[k].title) data.en[k].title = data.lg[k].title;
if (data.lg[k].description)
data.en[k].description = data.lg[k].description;
if (data.lg.properties) {
console.log("properties");
console.log(data.en.properties);
console.log(data.lg.properties);
const res = replacelg({
en: data.en.properties,
lg: data.lg.properties,
});
data.lg.properties = res.lg;
data.en.properties = res.en;
}
})
return data
}
const getschemalg = (schemaPath,lg) => {
});
return data;
};
const getschemalg = (schemaPath, lg) => {
if (schemaPath.slice(-5) != ".json") schemaPath += ".json";
if (schemaPath.substring(0, 4) == "http") {
// make an http request to fetch the schema with an await axios
} else {
schemaPath = `../../${schemaPath}`;
if (log) console.log(currentmod,"resolve path schemaPath:",path.resolve(schemaPath))
if (log)
console.log(
currentmod,
"resolve path schemaPath:",
path.resolve(schemaPath)
);
if (!fs.existsSync(schemaPath)) {
return {};
} else {
let schemalg = fs.readJsonSync(schemaPath);
if (lg!="en"){
let lgtrans={}
try{
lgtrans=fs.readJsonSync(schemaPath.replace('/schema/','/schema/lg/').replace('.json',`_${lg}.json`));
const res= replacelg({en:schemalg,lg:lgtrans})
if (lg != "en") {
let lgtrans = {};
try {
lgtrans = fs.readJsonSync(
schemaPath
.replace("/schema/", "/schema/lg/")
.replace(".json", `_${lg}.json`)
);
const res = replacelg({ en: schemalg, lg: lgtrans });
//console.log(res.en.title,res.lg.title)
schemalg=res.en
}catch(err){
schemalg = res.en;
} catch (err) {
// console.log('Err',err)
// no translation file, deliver en by default
}
}
return schemalg
return schemalg;
}
}
};
if (log) console.log(currentmod,`${objectPathname}/conf.json`);
const convoptionstoenum=(propertie,lg)=>{
if (!propertie.options) return propertie;
if (!(propertie.options["$ref"])){
propertie.msg="missingref"
return propertie
}
let optionsfile;
let optionstype;
if (propertie.options["$ref"].includes("/options/")) {
optionstype = "options";
optionsfile = path.resolve(
`../../${propertie.options["$ref"]}_${lg}.json`
);
}
if (propertie.options["$ref"].includes("/idx/")) {
optionstype = "idx";
optionsfile = path.resolve(
`../../${propertie.options["$ref"]}.json`
);
}
if (log) console.log(currentmod, "Lien vers options:", optionsfile);
if (!fs.existsSync(optionsfile)) {
propertie.msg = "missingref";
return propertie;
} else {
delete propertie.options
if (optionstype == "options") {
propertie.enum =
fs.readJSONSync(optionsfile).lst_idx;
}
if (optionstype == "idx") {
propertie.enum = fs.readJSONSync(optionsfile);
}
}
return propertie
}
if (log) console.log(currentmod, `${objectPathname}/conf.json`);
const res = {
status: 200,
ref: "Odmdb",
@ -211,58 +260,58 @@ Odmdb.Schema = (objectPathname, validschema, lg="en") => {
};
if (fs.existsSync(`${objectPathname}/conf.json`)) {
res.data.conf=fs.readJsonSync(`${objectPathname}/conf.json`);
res.data.schema = getschemalg(res.data.conf.schema,lg)
}else{
res.data.conf={}
res.data.conf = fs.readJsonSync(`${objectPathname}/conf.json`);
res.data.schema = getschemalg(res.data.conf.schema, lg);
} else {
res.data.conf = {};
}
if (!res.data.schema || Object.keys(res.data.schema).length == 0 ) {
if (!res.data.schema || Object.keys(res.data.schema).length == 0) {
return {
status: 404,
ref: "Odmdb",
msg: "schemanotfound",
data: { objectPathname:path.resolve(objectPathname), schema: {} },
data: { objectPathname: path.resolve(objectPathname), schema: {} },
};
}
//@todo only 1 level $ref if multi level need to rewrite with recursive call
// get $ref from $def
if (res.data.schema["$defs"]){
Object.keys(res.data.schema["$defs"]).forEach(ss=>{
Object.keys(res.data.schema["$defs"][ss].properties).forEach(pp=>{
res.data.schema["$defs"][ss].properties[pp]=convoptionstoenum(res.data.schema["$defs"][ss].properties[pp],lg)
})
})
}
Object.keys(res.data.schema.properties).forEach((p) => {
//looking for type:object with $ref to load and replace by ref content (ref must be adminapi/ or tribeid/)
if (
res.data.schema.properties[p].type == "object" &&
res.data.schema.properties[p]["$ref"]
) {
const subschema = path.resolve(`../../${res.data.schema.properties[p]["$ref"]}.json`);
if (Object.keys(res.data.schema).length == 0) {
res.status = 404;
res.msg = "missingref";
res.data.missingref = res.data.schema.properties[p]["$ref"];
return res;
} else {
subschema.description += ` from external schema: ${res.data.schema.properties[p]["$ref"]}`;
res.data.schema.properties[p] = subschema;
}
}
//`../../${req.session.header.xtribe}/objects/persons`
//looking for options:{"$ref":"../objects/options/xxx.json"}
//to add enum:[] = content of options available in
let subschema
const localdef = res.data.schema.properties[p]["$ref"] && res.data.schema.properties[p]["$ref"].includes("#/")
if (
res.data.schema.properties[p].options &&
res.data.schema.properties[p].options["$ref"]
localdef &&
!(res.data.schema["$defs"] && res.data.schema["$defs"][res.data.schema.properties[p]["$ref"]])
) {
const optionsfile = path.resolve(`../../${res.data.schema.properties[p].options["$ref"]}_${lg}.json`)
if (log) console.log(currentmod,"Lien vers options:", optionsfile)
if (!fs.existsSync(optionsfile)){
res.status = 404;
res.msg = "missingref";
res.data.missingref = res.data.schema.properties[p].options["$ref"];
res.msg = "missinglocalref";
res.data.missingref = res.data.schema.properties[p];
return res;
}else{
if (!res.data.schema.apxref) {res.data.schema.apxref=[]}
if (!res.data.schema.apxref.includes(res.data.schema.properties[p].options["$ref"]))
res.data.schema.apxref.push(res.data.schema.properties[p].options["$ref"])
res.data.schema.properties[p].enum=fs.readJSONSync(optionsfile).lst_idx
}
if (localdef) {
res.data.schema.properties[p]=res.data.schema["$defs"][res.data.schema.properties[p]["$ref"]]
}else{
subschema = Odmdb.Schema(path.resolve(res.data.schema.properties[p]["$ref"]), validschema, lg)
if(subschema.status==200){
res.data.schema.properties[p]=subschema.data.schema;
}else{
subschema.data.originschemaproperty=p
return subschema
}
}
}
if (res.data.schema.properties[p].options){
//replace options with enum:[]
res.data.schema.properties[p]=convoptionstoenum(res.data.schema.properties[p],lg)
}
});
@ -487,7 +536,7 @@ Odmdb.cud = (objectPathname, crud, itm, role, runindex = true) => {
const existid = fs.existsSync(
`${objectPathname}/itm/${itm[getschema.data.schema.apxid]}.json`
);
if (log) console.log(currentmod,"Pass schema itm existid = ", existid)
if (log) console.log(currentmod, "Pass schema itm existid = ", existid);
/*const pathindex = `${objectPathname}/idx/lst_${getschema.data.schema.apxid}.json`;
if (!fs.existsSync(pathindex)) {
fs.outputJSONSync(pathindex, []);
@ -545,7 +594,7 @@ Odmdb.cud = (objectPathname, crud, itm, role, runindex = true) => {
(crud == "D" && !accessright.D) ||
(crud == "U" && !accessright.U)
) {
if (log) console.log(currentmod,"Forbidden accessright:", accessright);
if (log) console.log(currentmod, "Forbidden accessright:", accessright);
return {
status: 403,
ref: "Odmdb",
@ -581,14 +630,18 @@ Odmdb.cud = (objectPathname, crud, itm, role, runindex = true) => {
false
);
if (chkdata.status != 200) {
if (log) console.log(currentmod,"Unconsistency data", chkdata);
if (log) console.log(currentmod, "Unconsistency data", chkdata);
return chkdata;
}
if (log) console.log(currentmod,"Data compliance with schema");
if (log) console.log(currentmod, "Data compliance with schema");
if (!getschema.data.schema.apxuniquekey)
getschema.data.schema.apxuniquekey = [];
if (log) console.log(currentmod,`${objectPathname}/itm/${chkdata.data.apxid}.json`);
if (log) console.log(currentmod,chkdata.data.itm);
if (log)
console.log(
currentmod,
`${objectPathname}/itm/${chkdata.data.apxid}.json`
);
if (log) console.log(currentmod, chkdata.data.itm);
fs.outputJSONSync(
`${objectPathname}/itm/${chkdata.data.apxid}.json`,
chkdata.data.itm
@ -596,7 +649,7 @@ Odmdb.cud = (objectPathname, crud, itm, role, runindex = true) => {
}
//if (log) console.log(currentmod,"getschema", getschema);
//rebuild index if requested
if (log) console.log(currentmod,"runidx", runindex);
if (log) console.log(currentmod, "runidx", runindex);
if (runindex) Odmdb.runidx(objectPathname, getschema.data.schema);
getschema.data.conf.lastupdatedata = dayjs().toISOString();
fs.outputJSONSync(`${objectPathname}/conf.json`, getschema.data.conf);
@ -619,7 +672,7 @@ Odmdb.cud = (objectPathname, crud, itm, role, runindex = true) => {
*
*/
Odmdb.runidx = (objectPathname, schema) => {
if (log) console.log(currentmod,`idx for ${objectPathname}`);
if (log) console.log(currentmod, `idx for ${objectPathname}`);
if (!schema || !schema.apxid) {
const getschema = Odmdb.Schema(objectPathname, true);
if (getschema.status != 200) return getschema;
@ -701,8 +754,8 @@ Odmdb.runidx = (objectPathname, schema) => {
itm[ventil[n].keyval.split(".")[0]]
) {
let itmval = JSON.parse(JSON.stringify(itm));
if (log) console.log(currentmod,ventil[n].keyval);
if (log) console.log(currentmod,itmval);
if (log) console.log(currentmod, ventil[n].keyval);
if (log) console.log(currentmod, itmval);
ventil[n].keyval
.split(".")
.forEach((i) => (itmval = itmval[i] ? itmval[i] : null));
@ -752,15 +805,16 @@ Odmdb.ASUPidxfromitm = (
idxs = [],
schema
) => {
if (log) console.log(currentmod,`idxfromitem for ${objectPathname} action:${crud}`);
if (log)
console.log(currentmod, `idxfromitem for ${objectPathname} action:${crud}`);
if (!schema || !schema.apxid) {
const getschema = Odmdb.Schema(objectPathname, true);
if (getschema.status != 200) return getschema;
schema = getschema.data.schema;
}
if (log) console.log(currentmod,schema.apxuniquekey);
if (log) console.log(currentmod, schema.apxuniquekey);
const itms = crud == "I" ? glob.sync(`${objectPathname}/itm/*.json`) : [itm];
if (log) console.log(currentmod,itms);
if (log) console.log(currentmod, itms);
if (crud == "I") {
//reinit all idx
idxs.forEach((idx) => {
@ -787,8 +841,8 @@ Odmdb.ASUPidxfromitm = (
idxtoreindex.push(idx); //@todo
}
}
if (log) console.log(currentmod,idx.keyval);
if (log) console.log(currentmod,itm[idx.keyval]);
if (log) console.log(currentmod, idx.keyval);
if (log) console.log(currentmod, itm[idx.keyval]);
if (
["C", "U", "I"].includes(crud) &&

View File

@ -32,16 +32,6 @@ At each reboot a process runs to analyse /apxtri/routes and api/models where only
4 to delete a user sudo userdel smatchit (this keeps the folder smatchit; to remove the folder smatchit => sudo userdel --remove smatchit)
/tribes/tribeid
Manage a tribeid space
* create

View File

@ -302,16 +302,14 @@ router.get(
* @apiParam {String} objectname Mandatory; if in conf.nationObjects then the file is in nationchains/ else in /tribes/xtribe/objectname
* @apiParam {String} primaryindex the unique id where the item is stored
*
* @apiError {json} objectNotfound the file item does not exist
* @apiError {json} objectfiledoesnotexist the file item does not exist
* @apiErrorExample {json}
* HTTP/1.1 404 Not Found
* {"status":404,"ref":"Odmdb","msg":"doesnotexist","data":{"objectname":"objectname","key":"apxid","val":"primaryindex"}}
*
* {status:404,ref: "Odmdb",msg: "objectfiledoesnotexist",data: { objectpath }}
* @apiSuccess {object} indexfile content
* @apiSuccessExample {json} Success-Response:
* HTTP/1.1 200 OK
* {"status":200, "ref":"Odmdb", "msg":"indexexist", "data":{"indexname","content":{itm file}}
*
* {status:200,ref:"Odmdb",msg:"itmfound", data:{itm:{} }} *
*
*/
// indexname = objectname_key_value.json
@ -323,9 +321,10 @@ router.get(
const objectpath = `../../${req.params.tribe}/objects/${req.params.objectname}/itm/${req.params.primaryindex}.json`;
if (fs.existsSync(objectpath)) {
res.status(200).json({ data: fs.readJsonSync(objectpath) });
res.status(200).json({status:200,ref:"Odmdb",msg:"itmfound", data:{itm: fs.readJsonSync(objectpath) }});
} else {
res.status(404).json({
status:404,
ref: "Odmdb",
msg: "objectfiledoesnotexist",
data: { objectpath },

View File

@ -234,6 +234,8 @@ router.post( '/downloadls', checkHeaders, isAuthenticated, ( req, res ) => {
.send( 'Forbidden access' )
}
} );
router.post( '/upfilepond', checkHeaders, isAuthenticated, ( req, res ) => {
console.log( 'post adminapi/tribes/uploadfilepond' );
// Store file and return a unique id to save button