Due Diligence · Technical Audit · Digital Credentials

How to evaluate and choose a digital credentials platform

A complete due diligence framework — technical audit, functional validation, blockchain verifiability, master evaluation form, and comparative matrix for universities and educational institutions.

merahki.ai · April 2026 · v1.0

Core idea

Comparing demos, commercial claims, or feature lists is not enough. An institution must demand verifiable evidence: implemented standards, active certifications, functional tests, proven integrations, data architecture, privacy controls, operational continuity, and a real verifiability audit of the credential.

Table of contents

01 · Objectives, scope, and evaluation principles
02 · Recommended selection process
03 · What to request from the vendor
04 · Evaluation dimensions and required evidence
05 · How to validate declared features
06 · Technical audit and blockchain verifiability
07 · Red flags and discard criteria
08 · Master institutional evaluation form
09 · Demo, pilot, and testing script
10 · Scoring methodology and comparative matrix
11 · Economic, contractual, and exit plan evaluation
12 · Final institutional recommendation
A · Quick discard checklist
B · RFI/RFP template
C · Standards and frameworks references

01 · Objectives, scope, and principles

A rigorous evaluation, not a commercial demo.

This document enables universities to evaluate digital credentials platforms with technical, operational, legal, and functional rigor before signing a contract. It can be used as a market guide, internal committee document, due diligence checklist, RFI/RFP form, pilot test basis, or comparison matrix across multiple vendors.

The institution is not just buying a tool to issue visually attractive badges or certificates. It is choosing critical infrastructure for digital reputation, verification, portability, traceability, privacy, employability, academic integration, and future continuity.

Six guiding principles

01

Prioritize interoperability and portability

The institution must avoid being locked into a vendor or a proprietary viewer.

02

Distinguish standard, implementation, and certification

Saying a platform 'supports' a standard does not equal demonstrating a real, verifiable implementation.

03

Evaluate the full product, not just issuance

Analyze issuance, verification, revocation, corrections, recipient experience, integrations, analytics, operational governance, and future exit.

04

Demand evidence, not just claims

Every relevant claim must be backed by documents, tests, sandbox, logs, certificates, reports, or verifiable references.

05

Compare against real use cases

Validation must cover concrete scenarios: diploma, micro-credential with skills, stackable badge, LMS/SIS integration, public verification, and revocation.

06

Decide with weighted criteria

There must be a scoring system, minimum thresholds, and discard red flags.

Ten domains of minimum scope

Information security
Open standards and interoperability
Regulatory compliance and certifications
Blockchain, verifiability, and credential audit
Product features and user experience
Integrations, APIs, LMS, SIS, SSO, and webhooks
Privacy, consent, and data governance
Cloud infrastructure, continuity, and resilience
Monitoring, traceability, and operational audit
Operational sustainability, support, roadmap, and exit plan

02 · Recommended selection process

Seven phases. Skipping steps is expensive.

Skipping phases usually results in purchases based on commercial pitch, hidden costs, and painful migrations.

| Phase | Objective | Deliverable | What it validates | Progress signal |
| --- | --- | --- | --- | --- |
| 1. Internal definition | Align academic, technical, legal, and employability objectives | Prioritized use cases, success criteria, and owners | What you want to issue and why | The institution knows exactly what it needs |
| 2. Initial RFI | Filter vendors that don't meet minimums | Documentary response and evidence | Seriousness, standards, certifications, integrations, and support | Only viable vendors remain |
| 3. Guided demo | Compare real flows under the same script | Demo record with observations | Commercial claims and real UX | Claims are validated against actual product |
| 4. Due diligence | Review security, privacy, architecture, and compliance | Full checklist with traffic-light status | Hidden risks not visible in a demo | Structural weaknesses are detected |
| 5. Controlled pilot | Test with real institutional data and processes | Pilot results and feedback | Real operation, timelines, and adoption | It works in institutional context |
| 6. Economic and contractual review | Understand total cost and future exit | TCO, SLA, DPA, exit plan | Hidden costs, lock-in, and continuity | Economic comparison is realistic |
| 7. Decision and implementation | Choose and start with clear governance | Final matrix and rollout plan | Overall vendor maturity | Start with owners, KPIs, and scope defined |

Practical recommendation

The vendor should not see the full weighting before responding. Evidence first; then the institution applies the scoring. This prevents “optimized” answers without real substance.

03 · What to request from the vendor

The minimum evidence package.

If the vendor cannot or will not provide it, that is already a risk signal.

| Document / evidence | Why it matters | Acceptable if… |
| --- | --- | --- |
| Technical product sheet and architecture | Understand real scope, limits, modules, dependencies, and data model | Describes environments, architecture, main flows, modules, APIs, and dependencies |
| API documentation + sandbox | Validates real integrability | Includes endpoints, auth, examples, rate limits, versioning, and test environment |
| Supported standards matrix | Prevents vague claims | States exact standard, version, scope, and conformance evidence |
| Active certifications list | Proves maturity | Includes body, validity, scope, and audit date |
| DPA/DPSA and privacy policy | Reviews legal obligations | Clarifies roles, sub-processors, international transfers, and data subject rights |
| Sub-processors list | Shows the data treatment chain | Includes vendor, function, country, and contractual safeguards |
| SLA and support scheme | Allows service enforcement | Defines times, channels, severity levels, escalation, and coverage hours |
| Continuity / disaster recovery plan | Evaluates resilience | States backups, RTO, RPO, and periodic testing |
| Pentest report or executive letter | Measures real security | Recent, by independent third party, with documented remediations |
| Reference customers | Contrasts claims with real use | Cases comparable to the institution |
| Exit plan | Prevents lock-in | Explains export, verification continuity, revocations, and exit costs |

merahki.ai + pok.tech

You already know what to ask. We already have the answers.

merahki.ai, powered by pok.tech, delivers the complete evidence package this guide requires: open standards, active ISO 27001, blockchain-verifiable credentials, documented APIs with sandbox, and a clear exit plan.

See the certification solution

04 · Evaluation dimensions and required evidence

Ten pillars. Each with documentable evidence.

4.1

Information security

  • Mandatory MFA for administrative and issuer profiles.
  • Role management with least-privilege principle and immediate access revocation.
  • Encryption in transit with TLS 1.2+ and encryption at rest, ideally AES-256.
  • Auditable logs for access, permission changes, issuance, revocation, export, and critical events.
  • Vulnerability scans, external pentests, secure SDLC, code review, and dependency management.
  • Incident response plan, clear owners, and notification SLA.

How to validate it

Request a live demo of user creation, role change, and revocation; ask for logs; request evidence of encryption policies and an executive summary of security audits.

4.2

Open standards and interoperability

  • Open Badges 3.0 implemented natively, not just mentioned commercially.
  • W3C Verifiable Credentials when the product claims to issue verifiable credentials aligned to that model.
  • CLR when the project needs a broader longitudinal or academic record.
  • LTI 1.3, OneRoster, CASE, or other frameworks when the institution needs formal academic integrations.
  • European Learning Model (ELM) and Europass compatibility for alignment with the European ecosystem, academic mobility, semantic portability, or issuance of European Digital Credentials for Learning.
  • Ability to export in standard formats and validate with independent tools.

How to validate it

For any standard: exact version, concrete scope, issued examples, documentation, external validators, and where it exists, certification or presence in official directories.
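The "validate with independent tools" point can start with a simple structural check before running official validators. A minimal sketch in Python, with field names following the Open Badges 3.0 profile of the W3C Verifiable Credentials data model and an invented sample payload; this flags obviously malformed exports, it is not a conformance test.

```python
import json

# Minimal structural sanity check for an Open Badges 3.0 credential,
# which is a profile of the W3C Verifiable Credentials data model.
# This is NOT a conformance test; use the official 1EdTech validators
# for that. It only catches obviously malformed exports early.

REQUIRED_TOP_LEVEL = ("@context", "type", "issuer", "credentialSubject")

def structural_issues(credential: dict) -> list[str]:
    issues = []
    for field in REQUIRED_TOP_LEVEL:
        if field not in credential:
            issues.append(f"missing required field: {field}")
    types = credential.get("type", [])
    if "VerifiableCredential" not in types:
        issues.append("type does not include 'VerifiableCredential'")
    if "OpenBadgeCredential" not in types:
        issues.append("type does not include 'OpenBadgeCredential'")
    if "proof" not in credential:
        issues.append("no embedded proof (may be externally secured, e.g. JWT)")
    return issues

# Invented sample payload for illustration only.
sample = json.loads("""{
  "@context": ["https://www.w3.org/ns/credentials/v2"],
  "type": ["VerifiableCredential", "OpenBadgeCredential"],
  "issuer": {"id": "https://university.example/issuers/1", "type": ["Profile"]},
  "credentialSubject": {"type": ["AchievementSubject"],
                        "achievement": {"name": "Sample Badge"}}
}""")
print(structural_issues(sample))
```

A clean export should produce an empty issue list; anything reported here warrants a question to the vendor before deeper testing.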

4.3

Regulatory compliance and certifications

  • Active ISO/IEC 27001 for information security management.
  • SOC 2 Type II where applicable, covering Security, Availability, Confidentiality, Processing Integrity, and Privacy.
  • Compliance with GDPR, LGPD, CCPA, LFPDPPP, FERPA, or other relevant regulations.
  • Clear DPA/DPSA, identified sub-processors, and international transfer mechanisms.

How to validate it

Do not accept generic responses like 'we comply with GDPR'. Request contractual roles, policies, data subject rights flows, data map, and concrete mechanisms for deletion, export, and incident response.

4.4

Blockchain and verifiability

  • Precise statement of which blockchain is used and what function it serves.
  • Exact explanation of what is recorded on-chain vs. off-chain.
  • Confirmation that no personal data is stored on-chain.
  • Public validator and ability for independent verification on an explorer.
  • Ability to audit hash, timestamp, status, and traceability of a credential.
4.5

Technical product features

  • Individual and bulk issuance.
  • Templates, branding, multi-language, academic units, and role-based permissions.
  • Skills, competencies, outcomes, evidence, rubrics, pathways, stackability, and credential relationships.
  • Revocation, expiration, renewal, reissuance, controlled correction, and versioning.
  • Recipient wallet or locker, sharing, download, public verification, accessibility, and mobile experience.
  • Operational analytics and reports useful for academic and employability management.
4.6

Integrations and APIs

  • Documented APIs with clear auth, sandbox, examples, rate limits, versioning, and backward compatibility.
  • SSO with SAML 2.0, OAuth 2.0, or OpenID Connect as applicable.
  • LMS, SIS, CRM, ERP, HRIS, assessment platforms, and messaging.
  • Webhooks or events for issuance, revocation, claim, expiration, or other flows.
  • LTI 1.3 and other edtech frameworks when applicable.
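Webhook deliveries for issuance, revocation, or claim events are typically authenticated with an HMAC signature over the raw request body. A minimal sketch, assuming a hex-encoded SHA-256 HMAC in an `X-Signature` header; the actual header name, algorithm, and encoding are vendor-specific and must be taken from the platform's documentation.

```python
import hashlib
import hmac

# Generic webhook signature check. Header name ('X-Signature'), algorithm
# (SHA-256), and encoding (hex) are assumptions; real vendors differ, so
# adapt this to the platform's actual webhook documentation.

def verify_webhook(secret: bytes, raw_body: bytes, signature_header: str) -> bool:
    expected = hmac.new(secret, raw_body, hashlib.sha256).hexdigest()
    # constant-time comparison to avoid leaking information via timing
    return hmac.compare_digest(expected, signature_header)

secret = b"shared-webhook-secret"  # provisioned once in the vendor console
body = b'{"event":"credential.issued","credential_id":"abc123"}'
sig = hmac.new(secret, body, hashlib.sha256).hexdigest()

print(verify_webhook(secret, body, sig))         # authentic delivery
print(verify_webhook(secret, body + b" ", sig))  # tampered payload
```

During evaluation, ask whether webhooks are signed at all; unsigned event endpoints are trivially spoofable.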
4.7

Privacy and data management

  • Exact map of personal and academic data: required, optional, and derived.
  • Data location by environment and by customer.
  • Retention, deletion, correction, anonymization, export, and right of access.
  • Explicit consent when applicable and auditable record of that consent.
  • Ability to separate verification continuity from personal data deletion.
4.8

Cloud infrastructure

  • Hosting region or country.
  • Redundancy, backups, high availability, and multi-tenant or single-tenant architecture.
  • RTO, RPO, operational continuity, and disaster recovery testing.
  • Secrets management, key management, and separated environments.
4.9

Monitoring and audit

  • Log retention period and detail level.
  • Security alerts and traceability of administrative actions.
  • Log integrity, exportability, and forensic analysis support.
  • Monitoring panels or reports available to the institution.
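One common way to make logs tamper-evident (the "log integrity" point above) is hash chaining, where each entry commits to the previous one. A minimal sketch of the technique, not the mechanism of any specific platform:

```python
import hashlib
import json

# Hash-chained audit log: each entry commits to the hash of the previous
# one, so any retroactive edit breaks the chain. A sketch of the
# technique, not the mechanism of any specific platform.

def entry_hash(event: dict, prev: str) -> str:
    canonical = json.dumps({"event": event, "prev": prev},
                           sort_keys=True, separators=(",", ":"))
    return hashlib.sha256(canonical.encode()).hexdigest()

def append(log: list[dict], event: dict) -> None:
    prev = log[-1]["hash"] if log else "0" * 64
    log.append({"event": event, "prev": prev, "hash": entry_hash(event, prev)})

def chain_is_intact(log: list[dict]) -> bool:
    prev = "0" * 64
    for entry in log:
        if entry["prev"] != prev or entry["hash"] != entry_hash(entry["event"], prev):
            return False
        prev = entry["hash"]
    return True

log: list[dict] = []
append(log, {"action": "issue", "credential": "abc123", "actor": "registrar"})
append(log, {"action": "revoke", "credential": "abc123", "actor": "registrar"})
print(chain_is_intact(log))            # intact chain
log[0]["event"]["actor"] = "attacker"  # retroactive edit
print(chain_is_intact(log))            # chain broken
```

When asking a vendor about log integrity, request the concrete mechanism (hash chaining, WORM storage, signed exports) rather than accepting "logs are immutable" as an answer.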
4.10

Operational sustainability

  • Clear and consistent product roadmap.
  • Support team and real implementation capacity.
  • Business viability and vendor continuity.
  • Community, partnerships, references, and service stability.

05 · How to validate declared features

Two vendors can answer "yes" to the same question and offer radically different products.

A feature should only be considered available if it was both documented and demonstrated end-to-end.

  • Require a demonstration based on a common script prepared by the university.
  • Request temporary access to a sandbox or test tenant.
  • Ask for documentation, screenshots, short video, or walkthrough for each critical feature.
  • Use an institution-specific test case with real or semi-real data.
  • Verify the feature end-to-end: configuration, issuance, student experience, external verification, post-correction, and administration.
  • Capture evidence in a register with traffic-light status: available, partially available, requires additional development, depends on partner/professional services, or not available.
  • Do not score roadmap, marketplace items, or third-party dependencies as natively available features.

Nine tests with 0–5 score

| Critical feature | Required test | Expected result |
| --- | --- | --- |
| Bulk issuance | Issue a real batch with a test file | Issued correctly with traceability and controlled errors |
| Public verification | Validate credential without login from external link | Clear, public, and consistent verification |
| Revocation | Revoke and re-validate | Status changes correctly and is audited |
| Correction / update | Edit allowed field or reissue per policy | History trace and coherence maintained |
| Skills / competencies | Map skill, level, evidence, and criteria | Structure is visible and verifiable |
| LTI / LMS | Launch from LMS and capture context | Flow works with declared standard |
| Issuance API | Issue via API with test credentials | Issuance works with documented auth and response |
| Analytics | Extract dashboard or useful report | Data is usable for management |
| Export | Download data and credentials | Format is standard and usable |
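For the bulk issuance test, "controlled errors" means bad rows are reported with line numbers instead of failing the whole batch silently. A sketch of the kind of pre-validation worth demanding; the column names are hypothetical and should match the platform's actual bulk template:

```python
import csv
import io
import re

# Pre-validate a bulk issuance file so failures are explicit and traceable
# ("controlled errors" in the test above). Column names are hypothetical;
# match them to the platform's real bulk template.

EMAIL_RE = re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")
REQUIRED = ("recipient_email", "recipient_name", "credential_template")

def validate_batch(raw_csv: str) -> tuple[list[dict], list[str]]:
    valid_rows, errors = [], []
    reader = csv.DictReader(io.StringIO(raw_csv))
    for line_no, row in enumerate(reader, start=2):  # header is line 1
        missing = [c for c in REQUIRED if not (row.get(c) or "").strip()]
        if missing:
            errors.append(f"line {line_no}: missing {', '.join(missing)}")
            continue
        if not EMAIL_RE.match(row["recipient_email"].strip()):
            errors.append(f"line {line_no}: invalid email {row['recipient_email']!r}")
            continue
        valid_rows.append(row)
    return valid_rows, errors

batch = """recipient_email,recipient_name,credential_template
ana@university.example,Ana Diaz,diploma-2026
not-an-email,Luis Perez,diploma-2026
maria@university.example,,diploma-2026
"""
rows, errs = validate_batch(batch)
print(len(rows), errs)
```

During the pilot, deliberately include a few malformed rows in the test file and check that the platform reports them this explicitly.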

06 · Technical audit and blockchain verifiability

The word “blockchain” is often used without technical rigor.

Critical warning

This section describes how to audit whether a credential was actually registered correctly and verifiably — or whether the vendor is only presenting a marketing narrative.

6.1 · Seven non-negotiable questions for the vendor

  • Which blockchain exactly: Ethereum, Polygon, LACNet, Bitcoin, Solana, Hyperledger, a private chain, a sidechain, or another network.
  • Who operates the nodes, and whether the network is public, permissioned, or fully private.
  • What data is recorded on-chain: a hash, an identifier, a cryptographic proof, a contract event, minimal metadata, or the full credential.
  • What data is stored off-chain and where it is hosted.
  • How integrity is validated for a credential and how revocation is proven.
  • How the platform avoids storing personal data on-chain.
  • What the relationship is between the credential viewer, the validation button, and the transaction shown on the explorer.

6.2 · Mandatory on-chain audit procedure (10 steps)

The university must open a real credential from the vendor, validate it with the validation button, then follow the hash, icon, or blockchain link shown on the credential, and confirm in the explorer that the transaction succeeded. The transaction must not be in a failed, reverted, or dropped state.

01

Open a real credential from the vendor — preferably a public credential issued by a client or by the vendor itself.

02

Use the credential's validation button. The validator must confirm the credential's status and expose, directly or indirectly, the link to the cryptographic proof or transaction.

03

Click the hash, icon, or blockchain link shown on the credential or validator. That link should lead to a blockchain explorer or equivalent publicly verifiable reference.

04

Once in the explorer, verify that the transaction exists and its status is successful. It must not appear as failed, reverted, dropped, cancelled, or equivalent states for that chain.

05

Review the transaction hash, timestamp, block, address or contract involved, and consistency with the credential's issuance date.

06

Review all relevant tabs in the explorer: Overview, Logs, Token Transfers, Internal Transactions, Events, or State. Do not stop at the transaction's overview page.

07

Confirm there is no misleading mismatch between viewer and explorer: for example, the viewer claims success while the explorer shows a failed transaction, no token transfers where minting is claimed, or failed internal transactions.

08

Verify that the recorded data or emitted event is reasonably connected to the audited credential. If the vendor only shows a generic explorer link with no way to connect it, that proof is insufficient.

09

Repeat the validation on more than one credential if possible: one active, one revoked, and one recently issued.

10

Document screenshots, URLs, transaction status, and any inconsistencies detected.
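Steps 04 and 05 can be scripted for EVM-style chains. The sketch below operates on a transaction receipt dict shaped like the Ethereum JSON-RPC response (with web3.py you would fetch it via `w3.eth.get_transaction_receipt`); the 2-day tolerance window and the specific checks are illustrative assumptions, not an audit standard.

```python
from datetime import datetime, timedelta, timezone

# Automates steps 04-05: the receipt must be successful and roughly
# consistent with the credential's issuance date. The receipt dict
# mirrors Ethereum JSON-RPC fields; with web3.py you would fetch it
# via w3.eth.get_transaction_receipt(tx_hash). The 2-day tolerance
# window is an illustrative assumption.

def audit_receipt(receipt: dict, block_time: datetime, issued_at: datetime,
                  max_lag: timedelta = timedelta(days=2)) -> list[str]:
    findings = []
    if receipt.get("status") != 1:  # 1 = success, 0 = reverted
        findings.append("transaction failed or reverted")
    if not receipt.get("logs"):
        findings.append("no events/logs emitted; check how the proof is recorded")
    if abs(block_time - issued_at) > max_lag:
        findings.append("timestamp inconsistent with issuance date")
    return findings

# Invented receipt for a clean, successful anchoring transaction.
receipt = {
    "status": 1,
    "blockNumber": 19_000_000,
    "logs": [{"address": "0x0000000000000000000000000000000000000000"}],
}
issued = datetime(2026, 3, 10, tzinfo=timezone.utc)
mined = datetime(2026, 3, 10, 4, 30, tzinfo=timezone.utc)
print(audit_receipt(receipt, mined, issued))
```

An empty findings list means the receipt passed these basic checks; any finding maps directly to a red flag in the matrix in 6.3.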

6.3 · Explorer review operational matrix

| Element | What should be visible | Red flag |
| --- | --- | --- |
| Status | Success, confirmed, or equivalent. The transaction must exist and have been processed correctly. | Failed, reverted, dropped, cancelled, indefinite pending, or unexplained error. |
| Token Transfers | Only if the model actually implies minting or transfer. Must be consistent with the audited credential. | No transfers when vendor claims minting/NFT; transfers inconsistent with date, contract, or recipient. |
| Internal Transactions | Must be consistent with contract logic, or absent if the architecture doesn't use them. | Failed internal transactions, reverts, or inconsistent traces without sufficient technical explanation. |
| Logs / Events | Contract events or logs that link the on-chain evidence to the credential. | No way to connect the credential to the event, or logs contradict the viewer. |
| Timestamp, block, and contract | Must be consistent with issuance date, network, and contract declared by the vendor. | Relevant time differences, unidentified contract, or different network than declared. |

Not all architectures register a credential as a token transfer. Some only record hashes, events, or cryptographic proofs. Verification is not about “seeing an NFT” — it is about confirming that the real on-chain proof exists, was successful, and is consistently linked to the audited credential.
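In hash-anchoring architectures, step 08 ("connect the recorded data to the credential") boils down to recomputing the credential's hash and comparing it with the anchored value. A sketch assuming SHA-256 over canonicalized JSON; the real canonicalization scheme and hash algorithm vary by vendor and must be documented:

```python
import hashlib
import json

# Step 08 in practice for hash-anchoring architectures: recompute the
# credential's hash and compare it with the value anchored on-chain.
# SHA-256 over sorted-key JSON is an assumed canonicalization; replace
# it with the scheme the vendor actually documents.

def credential_hash(credential: dict) -> str:
    canonical = json.dumps(credential, sort_keys=True, separators=(",", ":"))
    return hashlib.sha256(canonical.encode()).hexdigest()

# Hypothetical credential and the hash its issuer claims to have anchored.
credential = {
    "id": "urn:uuid:00000000-0000-0000-0000-000000000001",
    "name": "Data Analysis Micro-credential",
    "issued_on": "2026-03-10",
}
onchain_hash = credential_hash(credential)

# An auditor recomputes from the credential they were actually shown:
print(credential_hash(credential) == onchain_hash)  # matches
tampered = dict(credential, issued_on="2025-03-10")
print(credential_hash(tampered) == onchain_hash)    # any change breaks the link
```

If the vendor cannot explain how to perform this recomputation independently, the explorer link proves only that some transaction exists, not that it covers this credential.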

6.4 · Six minimum practical verifiability tests

Test 1

Test credential generation

Request a test credential, download it or view it in the viewer, and extract its identifiers.

Test 2

Independent hash or txid verification

Use the explorer directly — do not rely solely on the vendor's viewer.

Test 3

Cryptographic signature validation

Or verifiable structure validation where the standard allows it.

Test 4

Revocation

Revoke the test credential and confirm that the status changes in the validator and, when applicable, in the on-chain evidence.

Test 5

Public validator

Confirm a third party can verify the credential without login and without exposing unnecessary data.

Test 6

Portability

Export the credential or its metadata in standard format and confirm it remains usable.

6.5 · Nine blockchain-washing red flags

Cannot explain exactly which blockchain they use.
Cannot show a real transaction on a public or verifiable explorer.
Validation only works inside the proprietary viewer.
The validation button does not expose independently verifiable evidence.
The explorer shows failed transactions, failed internal transactions, or time inconsistencies.
No reasonable way to link the credential to the on-chain data.
Personal data appears to be stored directly on-chain.
Charges for credential verification, or verification depends entirely on the vendor's commercial continuity.
Talks about 'immutability' or 'NFT' without being able to show auditable technical proof.

07 · Red flags and discard criteria

Situations that justify stopping an evaluation.

Unless the vendor can address them with strong, verifiable evidence.

Absence of relevant technical documentation.
Refusal to show sandbox or functional tests.
Standards claims without documentation, validators, or examples.
No response on sub-processors, data location, or DPA/DPSA.
No MFA for issuers and administrators.
Inability to export data and credentials in usable formats.
No exit plan or verification continuity after contract ends.
Ambiguous blockchain use without independent proof.
Expired, partial, or non-applicable certifications.
Roadmap presented as currently available product.

merahki.ai + pok.tech

None of these red flags apply to us.

Full technical documentation. Auditable blockchain trail on public explorers. Active ISO 27001. DPA-ready. APIs with sandbox. A clear exit plan — all verifiable, no commercial pitch needed.

Verify it yourself

08 · Master institutional evaluation form

How to score each item.

This battery can be used as an RFI/RFP or due diligence form. Responses should include attached evidence, and each item marked as: Documented · Demonstrated · Certified · Pending · Not available.

8.1

Product and functional scope

01

What types of credentials does the platform issue and manage?

02

Does it support individual and bulk issuance, renewal, expiration, revocation, reissuance, and versioning?

03

Does it support skills, competencies, learning outcomes, evidence, rubrics, or alignments?

04

Does it support pathways, stackability, equivalencies, or relationships between credentials?

05

Does it support multiple brands, academic units, campuses, languages, and role-based permissions?

8.2

Standards and interoperability

01

Which standards does it support exactly? State version and scope: Open Badges, W3C Verifiable Credentials, CLR, European Learning Model (ELM), Europass / European Digital Credentials for Learning, LTI, OneRoster, CASE, or others.

02

Can the vendor demonstrate semantic compatibility and/or practical interoperability with Europass or ELM? Attach field mapping, issued examples, validation, and functional evidence.

03

Which parts of the standard are natively implemented vs. require additional development?

04

Is there external certification, validation, or presence in official directories?

05

How is cryptographic verification, revocation, and portability between systems handled?

06

What dependency exists on proprietary viewers or wallets?

8.3

Integrations

01

What APIs are available? Attach documentation, auth, rate limits, and versioning.

02

Are there webhooks, queues, scheduled exports, or native connectors?

03

Which integrations exist with LMS, SIS, CRM, ERP, HRIS, assessment platforms, and SSO?

04

Does it support SAML, OAuth 2.0, OpenID Connect, or SCIM where applicable?

05

How are students, courses, results, cohorts, and changes synchronized?

8.4

Security

01

Is MFA required for administrators and issuers?

02

How is privileged access, tenant segregation, and immediate user revocation managed?

03

What encryption is used in transit and at rest?

04

Are auditable logs maintained? For how long? How is integrity and monitoring guaranteed?

05

How often are vulnerability scans and external pentests performed?

06

What secure SDLC, code review, dependency management, and CI/CD practices are applied?

07

What is the incident management and customer notification process?

8.5

Privacy and data

01

What personal and academic data is stored exactly? Separate required, optional, and derived.

02

Where is data hosted by environment and by customer? State country or region.

03

Who are the sub-processors and what role do they play?

04

How are retention, deletion, anonymization, correction, and export managed?

05

What mechanisms are used for international data transfers?

06

How is GDPR, FERPA, and applicable local legislation addressed?

8.6

Operation and service

01

What is the standard SLA and what does support include?

02

What language and time zone does support cover?

03

How is implementation done, how long does it take, and what depends on the customer?

04

What training is offered to administrators, issuers, and internal support?

05

What comparable references can they share?

8.7

Contract, costs, and exit

01

Describe the pricing model and all billable components.

02

State usage limits: storage, issuers, templates, integrations, wallets, analytics, and environments.

03

What happens at contract end for verification, hosting, and data export?

04

Is there a cost for migration, exit, or verification continuity?

05

Does the university retain ownership and control over its data and metadata?

09 · Demo, pilot, and testing script

Twelve steps of the pilot.

01

Configure a credential with institutional branding, metadata, criteria, and evidence.

02

Issue an individual credential and a bulk batch.

03

Assign skills or competencies and display them in the credential or its detail view.

04

Perform claim by the recipient and share it externally.

05

Verify the credential from outside the platform.

06

Audit the validation button and blockchain/explorer link where applicable.

07

Revoke a credential and verify the status change.

08

Correct an allowed field and review the audit trail.

09

Consume a sample API or webhook.

10

Show role permissions and segregation by academic units.

11

Export data, metadata, and relevant evidence.

12

Test accessibility, mobile experience, and multi-language where required.

10 · Scoring methodology

Scale 0 – 5.

Scoring must only be applied to documented and demonstrated features or controls. Roadmap items do not count as current availability.

| Score | Meaning | Interpretation | Acceptance |
| --- | --- | --- | --- |
| 0 | Does not comply | Not implemented or explicitly unsupported | 🔴 Rejection |
| 1 | Very weak / roadmap | Promised or partially conceptual | 🟠 Do not count as available feature |
| 2 | Partial | Exists with significant limitations | 🟠 Requires additional investigation |
| 3 | Adequate | Correctly implemented with sufficient evidence | 🟣 Meets minimum |
| 4 | Solid | Robustly implemented with auditable maturity | 🟢 Very good level |
| 5 | Excellent | Exceptional, transparent, leading implementation | 🟢 Clear strength |

Suggested weighting by dimension

| Dimension | Weight | Min. threshold |
| --- | --- | --- |
| Standards and interoperability | 20% | ≥ 3/5 |
| Product features | 20% | ≥ 3/5 |
| Security | 15% | ≥ 4/5 |
| Privacy and data | 15% | ≥ 4/5 |
| Blockchain and verifiability | 10% | ≥ 3/5 |
| Integrations and APIs | 10% | ≥ 3/5 |
| Operations and support | 5% | ≥ 3/5 |
| Total economics | 3% | ≥ 3/5 |
| Exit plan / no lock-in | 2% | ≥ 3/5 |

Suggested formula

Final score = Σ (dimension score × weight)

Interpretation

  • ≥ 4.0 · recommended
  • 3.0 – 3.9 · acceptable with reservations
  • < 3.0 · not recommended

Governance rule

Even if the overall score is high, a vendor should not pass if it does not meet minimums in security, privacy, or verifiability.
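The weighting, thresholds, formula, and governance rule can be combined into one evaluation routine. A sketch using the suggested weights; the shorthand dimension keys are labels introduced here for readability:

```python
# Combines the suggested weights, per-dimension minimum thresholds, the
# weighted-sum formula, and the governance rule: a failed minimum in
# security, privacy, or verifiability rejects the vendor regardless of
# the overall average. Dimension keys are shorthand labels used here.

WEIGHTS = {
    "standards": 0.20, "features": 0.20, "security": 0.15, "privacy": 0.15,
    "blockchain": 0.10, "integrations": 0.10, "operations": 0.05,
    "economics": 0.03, "exit_plan": 0.02,
}
THRESHOLDS = {
    "standards": 3, "features": 3, "security": 4, "privacy": 4,
    "blockchain": 3, "integrations": 3, "operations": 3,
    "economics": 3, "exit_plan": 3,
}
HARD_GATES = ("security", "privacy", "blockchain")

def evaluate(scores: dict[str, float]) -> tuple[float, str]:
    final = sum(scores[d] * w for d, w in WEIGHTS.items())
    failed = [d for d, t in THRESHOLDS.items() if scores[d] < t]
    gated = [d for d in failed if d in HARD_GATES]
    if gated:
        return final, "rejected: failed minimum in " + ", ".join(gated)
    if final >= 4.0:
        return final, "recommended"
    if final >= 3.0:
        return final, "acceptable with reservations"
    return final, "not recommended"

# High weighted average (4.17), but security is below its 4/5 threshold,
# so the governance rule still rejects this vendor.
vendor = {"standards": 5, "features": 4, "security": 3, "privacy": 5,
          "blockchain": 4, "integrations": 4, "operations": 4,
          "economics": 3, "exit_plan": 4}
print(evaluate(vendor))
```

The hypothetical vendor above illustrates exactly why the governance rule exists: its weighted average clears the 4.0 bar while a hard gate fails.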

11 · Economic, contractual, and exit plan evaluation

Total cost, real TCO, and exit.

  • Review pricing per issuer, per student, per credential, per module, per integration, or per storage.
  • Identify hidden costs: implementation, branding, APIs, analytics, premium support, migration, wallet, templates, environments, and training.
  • Require SLA, DPA, liability limits, sub-processors, continuity, backups, and incident handling.
  • Confirm what happens at contract end: data export, revocations, verification continuity, costs, and delivery format.
  • Verify that the university retains ownership and control over its data, metadata, and evidence.
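A simple way to keep the comparison honest is to fold one-time, recurring, and exit costs into a single multi-year TCO figure per vendor. All cost categories and figures below are hypothetical placeholders:

```python
# Fold one-time, recurring, and exit costs into a multi-year total cost
# of ownership. All categories and figures are hypothetical placeholders
# to be replaced with each vendor's real quote.

def tco(one_time: dict[str, float], annual: dict[str, float],
        exit_costs: dict[str, float], years: int = 4) -> float:
    return (sum(one_time.values())
            + years * sum(annual.values())
            + sum(exit_costs.values()))

vendor_a = tco(
    one_time={"implementation": 8_000, "branding": 1_500, "training": 2_000},
    annual={"license": 12_000, "premium_support": 3_000, "extra_storage": 500},
    exit_costs={"data_migration": 4_000, "verification_continuity": 2_500},
)
print(vendor_a)  # compare this number across vendors, not the license line item
```

Including exit costs in the same figure is what prevents a cheap license with an expensive migration from looking like the economical option.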

12 · Final institutional recommendation

Do not choose based on aesthetics or commercial pitch.

A university should choose a platform based on demonstrated capacity to issue, verify, integrate, preserve, protect, and govern credentials and data with open standards, verifiable evidence, and an understandable total cost.

Best practice is to combine documentary form, guided demo, due diligence, real pilot, weighted scoring, and contractual review. When a vendor truly has the capability they claim, this process strengthens their case. When they don't, this process exposes it.

Every mature evaluation should include the practical audit of a real credential and its validation evidence. If the credential cannot be verified reliably and independently, the promise of verifiability is seriously undermined.

Annex A · Quick discard checklist

Eight traffic-light questions.

Does it have active ISO 27001 and, if applicable, SOC 2 Type II?
Does it implement open standards with an exact version and real evidence?
Can it demonstrate, when claimed, real compatibility with Europass and/or the European Learning Model (ELM), with technical evidence and functional validation?
Can a credential be verified independently and its technical evidence audited?
Does it have documented APIs, sandbox, and auditable logs?
Does it comply with privacy requirements and avoid storing personal data on-chain?
Does it allow exporting data and credentials in usable formats?
Does it have an exit plan and verification continuity?

If the answer is no to any of these questions and the vendor does not provide evidence of immediate remediation, the platform should move to discard or pause status.

Annex B · RFI / RFP template

Ten blocks of an RFI / RFP.

01 · Institutional description and use cases.
02 · Expected credential volume and user profiles.
03 · Required standards and minimum accepted versions.
04 · Mandatory and desirable integrations.
05 · Security, privacy, and hosting requirements.
06 · Support requirements, languages, and implementation timelines.
07 · Mandatory evidence to present.
08 · Pilot format and acceptance criteria.
09 · Required pricing model and full breakdown.
10 · Minimum export and exit conditions.

Annex C · Standards and frameworks references

Official sources consulted.

01

1EdTech Consortium — Open Badges. Official standard page and implementation resources.

www.1edtech.org/standards/open-badges

02

1EdTech — Open Badges Certification Process. Official conformance certification process.

www.1edtech.org/certification/open-badges

03

1EdTech — TrustEd Apps Program. Official directory of products with verified certification/interoperability.

www.1edtech.org/program/trustedapps

04

W3C — Verifiable Credentials Data Model v2.0. Official recommendation published May 15, 2025.

www.w3.org/TR/vc-data-model-2.0

05

Europass — European Digital Credentials. Official infrastructure for creating, issuing, storing, sharing, and verifying European digital credentials.

europass.europa.eu/en/european-digital-credentials

06

Europass — EDC for Learning. Official definition of European digital credentials for learning.

europass.europa.eu/en/european-digital-credentials-learning

07

Europass — Information for Developers. Technical documentation for the Europass ecosystem.

europass.europa.eu/en/information-developers

08

Europass — European Learning Model (ELM) Browser. Official European data model for learning and credentials.

europa.eu/europass/elm-browser/index.html

09

Europass — Latest developments to the European Learning Model.

europass.europa.eu/en/news/latest-developments-european-learning-model

10

OWASP — Application Security Verification Standard (ASVS).

owasp.org/www-project-application-security-verification-standard

11

OWASP — Authentication Cheat Sheet.

cheatsheetseries.owasp.org

12

NIST — SP 800-218 Secure Software Development Framework (SSDF).

csrc.nist.gov/publications/detail/sp/800-218/final

13

EUR-Lex — Regulation (EU) 2016/679 (GDPR).

eur-lex.europa.eu/eli/reg/2016/679/oj

14

Etherscan Docs.

docs.etherscan.io

References consolidated from official sources of 1EdTech, W3C, Europass, OWASP, NIST, EUR-Lex, and Etherscan, reviewed in March 2026. Always verify the validity of certifications, standard versions, and technical evidence directly from official sources at the time of purchase.

Don't buy declarations.

Buy evidence.

merahki.ai · Complete Guide · v1.0 · April 2026

Get Started

Is your institution evaluating digital credentials platforms?

The merahki.ai team can help you apply this evaluation framework and find the right solution for your use cases.

30-min personalized walkthrough

A tailored demo of the platform matched to your specific use case.

Talk to an expert, not a sales rep

You'll speak with someone who deeply understands education-led growth.

Implementation roadmap included

Walk away with a clear plan for launching your first program.

Used by teams in 8+ industries

From healthcare to SaaS — we've seen and solved your challenges.

Free Report

The ELG Report — how to turn education into your #1 growth channel

Strategies, frameworks, and real data on Education-Led Growth to scale your academy and turn education into measurable revenue.

Trusted by the world's leading software review platforms, all solutions in the merahki.ai and partner ecosystem meet the highest industry standards.

Teal Spring Badge
Best Support
Capterra Best Value
Capterra Shortlist
GetApp Leaders
High Performer
Regional Leader
Software Advice Best Customer Support
Users Most Likely to Recommend
1EdTech
Europass
ISO 27001
LATAM EdTech