Ahmedabad students’ Derma Vision app flags 50 cases in Women’s Day screening — a model for school-led AI in preventive dermatology

Table of Contents

  1. Key Highlights:
  2. Introduction
  3. How Derma Vision works: the basics of computer-vision screening
  4. The Women’s Day screening: scale, outcomes and immediate impact
  5. Recognition and credibility: the Vigyantram award at IIT Delhi
  6. A clinician’s perspective: technology as a first line, not a final word
  7. Technical challenges and limits: accuracy, bias, and uncertainty
  8. Data protection, consent and ethical safeguards
  9. Validation and regulatory pathway: what it takes to move from prototype to medical tool
  10. Minimizing harm: managing false positives, false negatives and anxiety
  11. Comparable initiatives and lessons from existing digital dermatology tools
  12. Practical roadmap: what the Derma Vision team should prioritize next
  13. Education, innovation and the role of youth-led projects in public health
  14. Integration with health systems: teledermatology and referral networks
  15. Potential social benefits and risks in low-resource settings
  16. The students behind Derma Vision: names, roles and the learning journey
  17. Scaling responsibly: scenarios for broader deployment
  18. Funding, partnerships and sustainability
  19. Where Derma Vision fits in the larger picture of AI in healthcare
  20. Conclusion
  21. FAQ

Key Highlights:

  • Middle-school students at Zebar School in Ahmedabad developed Derma Vision, a computer-vision app that scanned more than 450 women at a community event and flagged roughly 50 participants for dermatology follow-up.
  • The project won Best Innovative Idea in Artificial Intelligence at the Vigyantram National Championship, IIT Delhi, and now faces the next phase: clinical validation, ethics safeguards, and a pathway to real-world deployment.

Introduction

A group of students in Classes 7 and 8 at Zebar School, Ahmedabad, turned a classroom project into a functional public-health tool. Derma Vision, an application that uses computer vision to review visible skin patterns and give simple guidance, was deployed during a local Women’s Day awareness programme where it screened more than 450 participants. The app identified about 50 people whose skin patterns merited professional dermatological assessment. The exercise combined community outreach, technology and early clinical triage; the team’s work also earned national recognition at a student AI championship hosted by the Indian Institute of Technology–Delhi.

The screening revealed two things at once: community-level demand for accessible skin-health checks, and the practical potential of lightweight AI tools to act as a first line of awareness. That potential does not remove the need for rigorous validation, ethical safeguards and careful integration with healthcare services. The next months will determine whether Derma Vision evolves from a promising prototype into a safe, effective screening aid.

How Derma Vision works: the basics of computer-vision screening

Derma Vision applies computer-vision techniques to photographs of the skin to detect visible patterns that may indicate dermatological conditions. At a functional level the app performs three tasks:

  1. Image capture and preprocessing — acquiring a clear, well-lit photo, normalizing color and scaling, and removing artefacts that could confuse downstream analysis.
  2. Pattern recognition — using trained models to identify features such as asymmetry, irregular borders, abnormal pigmentation, scaling, or lesion shape and size.
  3. Guidance and triage — presenting results in a simple format and recommending either basic home care or referral to a dermatologist when the model detects patterns associated with conditions warranting professional assessment.

Computer vision models for skin screening typically rely on convolutional neural networks (CNNs) or more recent transformer-based image architectures. These models learn correlations between visual features and labeled clinical outcomes. For a community screening tool, the output is not a definitive diagnosis but a risk classification: no concern, observe and follow preventive steps, or seek dermatological consultation.
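The three-tier output described above can be sketched as a simple mapping from a model's risk score to a recommendation. This is an illustrative sketch only: the function name, score scale, and thresholds are invented for demonstration and are not Derma Vision's actual values.

```python
# Illustrative triage sketch: map a hypothetical model risk score (0.0-1.0)
# to the three-tier screening output described above.
# Thresholds are invented for demonstration, not Derma Vision's real values.

def triage(risk_score: float) -> str:
    """Convert a model risk score into a screening recommendation."""
    if risk_score < 0.2:
        return "no concern"
    if risk_score < 0.6:
        return "observe and follow preventive steps"
    return "seek dermatological consultation"

print(triage(0.1))   # low-risk pattern
print(triage(0.75))  # pattern warranting specialist review
```

In a real deployment the thresholds would be set from validation data rather than chosen by hand, and the categories would be worded with clinician input.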

Derma Vision’s user flow — as described by the school — appears designed for large-group screening: attendees receive an AI-based skin scan, a readable result, and guidance. The students added basic care tips and home remedies for common, non-urgent issues, reserving specialist referral for higher-risk cases. That triage approach reduces unnecessary clinic visits while ensuring those with potentially serious signs are directed to care.

The Women’s Day screening: scale, outcomes and immediate impact

The screening took place during a Women’s Day programme attended by faculty, administrative and ground staff affiliated with Udgamverse schools. More than 450 women participated. Each participant underwent an AI skin scan using Derma Vision, and around 50 were advised to seek a dermatology consultation.

Those figures represent a substantial screening yield: roughly one in nine participants showed a visible pattern the app flagged for specialist attention. From a public-health perspective, that is a meaningful rate of referral at a single community event. The school described the results as demonstrating early-detection value and clear community usefulness.

Immediate benefits of such a programme include raising awareness about skin health, prompting follow-up for conditions that might otherwise go unnoticed, and reducing barriers to care for those who would not have sought dermatology assessment proactively. For participants who receive non-urgent guidance, the app can provide practical steps to manage minor issues safely at home.

At the same time, the screening outcome raises questions about follow-up, diagnostic confirmation, and the potential for false positives or anxiety induced by automated screening. The school has acknowledged the need for clinical validation and plans to consult dermatologists and build stronger ethical and clinical safeguards.

Recognition and credibility: the Vigyantram award at IIT Delhi

Derma Vision won the Best Innovative Idea in Artificial Intelligence award at the Vigyantram National Championship, 2026, held at IIT Delhi. The event attracted student teams from 49 schools across India, positioning the project on a national stage.

Awards like Vigyantram confer several non-trivial benefits for student projects. They validate technical novelty, improve visibility to potential collaborators and mentors, and open doors to institutional support. For Derma Vision, recognition at a premier technical institute signals that the concept has passed an initial creativity and feasibility filter. The next test involves clinical and ethical rigor: reproducibility of results, accuracy under diverse conditions, and safe handling of participant data.

Winning a national contest does not replace healthcare validation, but it can accelerate partnerships. The Zebar School team has indicated plans to collaborate with dermatologists to validate their model and to strengthen safeguards. Those collaborations will be essential if the app is to be used routinely in community health programmes.

A clinician’s perspective: technology as a first line, not a final word

A consulting dermatologist associated with the initiative emphasized the role of early awareness. Technology-enabled screening tools like Derma Vision can provide first-level guidance that encourages timely specialist consultation when needed, while helping others manage minor concerns at home.

Dermatology is a visually driven specialty. Many conditions present with distinctive surface changes that are visible to the naked eye or via a camera. That visibility makes the specialty particularly amenable to image-based screening and teledermatology. Yet skin conditions also vary with lighting, skin tone, lesion location, and image quality. A human clinician integrates history, symptom chronology and tactile findings—information an image alone cannot fully convey.

A responsible clinical approach treats AI-generated alerts as referral prompts, not diagnoses. The app’s current model, which suggests professional consultation only when necessary, aligns with that thinking. The next step is to quantify how often those prompts reflect true positives and false positives, and to measure whether the app improves clinical outcomes when combined with structured follow-up.

Technical challenges and limits: accuracy, bias, and uncertainty

Image-based skin screening faces notable technical hurdles. Any deployment plan must address three core issues: model accuracy, dataset representativeness, and the risk of bias.

  • Accuracy and metrics: Diagnostic performance depends on sensitivity (ability to flag true cases) and specificity (ability to avoid false alarms). For screening, higher sensitivity is often prioritized to avoid missing concerning signs, but excessive sensitivity increases false positives. Without published validation metrics (sensitivity, specificity, positive predictive value, negative predictive value) the app’s performance remains anecdotal. The screening event’s referrals provide an initial signal but not a substitute for controlled evaluation.
  • Dataset diversity: Skin appearance varies across pigmentations, ages and body sites. Models trained on datasets skewed to lighter skin tones or specific lesion types can underperform on underrepresented groups. Screening populations in India include diverse skin tones; the training data must reflect that diversity to avoid systematic bias.
  • Image quality and standardization: Ambient lighting, camera resolution and positioning affect feature visibility. Community screening environments are often non-ideal: varying light, movement, and differing device cameras. Robust preprocessing and clear capture protocols can reduce noise, but some variability is unavoidable.
  • Confounding factors: Skin manifestations can mimic multiple conditions. A lesion that looks suspicious on camera may be a benign keratosis, a healed scar, or a pigmentary change unrelated to malignancy. Conversely, early-stage conditions may lack dramatic visual signatures.

Addressing these technical points requires staged studies: retrospective testing on labeled datasets, prospective pilot cohorts with clinical adjudication, and randomized or controlled deployments to understand downstream outcomes.
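The metrics named above can be computed directly from a confusion matrix once a validation cohort has dermatologist-adjudicated ground truth. The counts below are hypothetical, chosen only to match the scale of a 450-person screening; they are not results from any Derma Vision study.

```python
# A minimal sketch of the screening metrics discussed above, computed from
# confusion-matrix counts. The counts are illustrative, not real study data.

def screening_metrics(tp: int, fp: int, fn: int, tn: int) -> dict:
    """Sensitivity, specificity, PPV and NPV from confusion-matrix counts."""
    return {
        "sensitivity": tp / (tp + fn),   # true cases correctly flagged
        "specificity": tn / (tn + fp),   # non-cases correctly passed
        "ppv": tp / (tp + fp),           # flagged cases that are true cases
        "npv": tn / (tn + fn),           # passed cases that are truly clear
    }

# Hypothetical cohort of 450: 40 true positives, 10 false positives,
# 5 false negatives, 395 true negatives.
m = screening_metrics(tp=40, fp=10, fn=5, tn=395)
print({k: round(v, 3) for k, v in m.items()})
```

Publishing numbers like these, broken down by lesion type and skin tone, is what would move the app's performance from anecdotal to evaluable.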

Data protection, consent and ethical safeguards

Any programme that collects skin images carries privacy and dignity implications. A responsible deployment must include:

  • Informed consent: Clear explanation of the app’s purpose, limits and what will happen to images and data. Participants should know whether images are stored, how long they are kept, who can access them and whether they will be used for model improvement.
  • Data minimization and anonymization: Store only what is necessary. When possible, strip metadata and apply techniques to anonymize facial or identifying features. For dermatologic images, careful cropping and de-identification reduce identifiability.
  • Secure storage and access control: Use encryption in transit and at rest. Limit access to authorized personnel and log access for auditability.
  • Regulatory compliance: Follow national guidelines for health data. In India, digital health initiatives operate alongside frameworks such as the Ayushman Bharat Digital Mission and with reference to regulations that govern medical devices and patient data. Projects should consult institutional review boards (IRBs) or ethics committees before broad deployment.
  • Special protections for minors: The app was developed by school students and screened adult participants, but any future extension to minors must obtain appropriate parental/guardian consent and adhere to child-protection regulations.
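The data-minimization point above can be illustrated with a common pattern: replacing a participant identifier with a salted one-way hash before a record is stored, so the stored data cannot be linked back to a name without the separately held salt. The field names, ID format, and salt handling here are assumptions for illustration, not Derma Vision's actual pipeline.

```python
import hashlib
import os

# Illustrative data-minimization sketch: pseudonymize a participant ID with a
# salted one-way hash before storage. Field names and ID format are invented.

def pseudonymize(participant_id: str, salt: bytes) -> str:
    """Return a salted SHA-256 pseudonym so stored records omit real identities."""
    return hashlib.sha256(salt + participant_id.encode("utf-8")).hexdigest()

salt = os.urandom(16)  # kept separately, under strict access control
record = {
    "participant": pseudonymize("ZS-2026-0117", salt),  # hypothetical ID
    "image_ref": "scan_0042.jpg",  # image itself stored encrypted, EXIF stripped
    # Note: no name, phone number or device metadata is retained in the record.
}
print(record["participant"][:12], "...")
```

In practice this would sit alongside encrypted storage, access logging, and a documented retention policy.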

Zebar School has acknowledged plans to improve the app’s ethical and clinical safeguards. Partnering with dermatologists and institutional oversight will be essential to ensure participant rights are protected.

Validation and regulatory pathway: what it takes to move from prototype to medical tool

Turning a classroom-built AI prototype into a clinically acceptable screening tool requires a disciplined validation and regulatory approach.

  1. Clinical validation studies
    • Retrospective validation: Test the model on labeled datasets with confirmed clinical diagnoses. Compute sensitivity, specificity and predictive values across lesion types and skin tones.
    • Prospective validation: Deploy the app in controlled settings where every flagged and non-flagged case undergoes dermatologist evaluation to quantify real-world performance.
    • Interobserver comparison: Compare AI classifications against multiple dermatologists to understand variability and where the model agrees or diverges.
  2. Usability and implementation testing
    • Evaluate how non-technical users interact with the app: clarity of instructions, capture reliability, and comprehension of guidance.
    • Test workflows for referral and follow-up, ensuring that flagged participants can access dermatology services.
  3. Regulatory classification
    • Determine whether the app constitutes a Software as a Medical Device (SaMD) under national rules. SaMD that provides clinical decision support is often subject to stricter oversight.
    • Prepare documentation: design controls, clinical evidence, risk analysis, post-market surveillance plans.
  4. Institutional approvals
    • Obtain ethics approvals for studies, especially when images and personal health data are involved.
    • Engage local health authorities or hospital partners when piloting in community settings.
  5. Post-market monitoring
    • If deployed, implement mechanisms for reporting adverse outcomes, monitoring false negatives, and updating the model responsibly.

India’s regulatory environment for medical devices and digital health has matured in recent years, but pathways for AI-driven SaMD require careful navigation. Collaborations with established clinical partners and institutional sponsors will smooth the path.

Minimizing harm: managing false positives, false negatives and anxiety

Automated screening can do harm if not handled correctly. False positives may cause unnecessary anxiety and overload specialist services; false negatives risk delayed diagnosis. Practical measures to reduce harm include:

  • Conservative messaging: Frame results as “suggestive” rather than definitive. Provide clear next steps: watchful waiting, home care instructions, or referral.
  • Triage thresholds: Adjust model sensitivity to the screening context. In high-prevalence or low-access settings, raising sensitivity may be appropriate; in other contexts, prioritize specificity.
  • Human-in-the-loop workflows: Route flagged cases to teledermatology review by clinicians before formal referral. This intermediary step filters false positives and reduces unnecessary clinic visits.
  • Education and counselling: Provide participants with accessible explanations about common skin conditions, expected next steps and where to get help.
  • Follow-up mechanisms: Record whether recommended referrals are acted upon and whether clinical diagnoses confirm the initial alert. Use that feedback to refine the model.

Those measures require system-level coordination — training non-clinical staff who operate the screening station, establishing telemedicine links to dermatologists, and ensuring participants understand the limits of the tool.
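The triage-threshold trade-off described above can be made concrete with a toy example: lowering the decision threshold on a set of model scores raises sensitivity (fewer missed cases) at the cost of specificity (more false alarms). The scores and labels below are invented purely to show the mechanics.

```python
# Hedged sketch of the triage-threshold trade-off: lowering the flagging
# threshold raises sensitivity but lowers specificity. Data is invented.

def sens_spec(scores, labels, threshold):
    """Sensitivity and specificity when flagging every score >= threshold."""
    tp = sum(s >= threshold and y for s, y in zip(scores, labels))
    fn = sum(s < threshold and y for s, y in zip(scores, labels))
    tn = sum(s < threshold and not y for s, y in zip(scores, labels))
    fp = sum(s >= threshold and not y for s, y in zip(scores, labels))
    return tp / (tp + fn), tn / (tn + fp)

scores = [0.9, 0.7, 0.5, 0.4, 0.3, 0.1]            # hypothetical model outputs
labels = [True, True, False, True, False, False]   # clinician-confirmed cases

for t in (0.45, 0.25):
    sens, spec = sens_spec(scores, labels, t)
    print(f"threshold={t}: sensitivity={sens:.2f}, specificity={spec:.2f}")
```

Here the lower threshold catches every true case but doubles the false alarms, which is exactly the calibration choice a low-access screening context forces.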

Comparable initiatives and lessons from existing digital dermatology tools

Derma Vision’s trajectory mirrors several earlier projects that applied AI to dermatology. Established examples include mobile apps and teledermatology services designed to detect suspicious moles or triage common skin conditions. Common lessons emerge:

  • Early clinical validation matters. Commercial apps that publish validation data and regulatory clearances gain user trust and clinical uptake.
  • Model transparency and documentation foster adoption. When developers publish details on training data composition and model limitations, clinicians can better assess fit-for-purpose.
  • Human oversight reduces risk. Hybrid models that combine automated analysis with remote clinician review perform better in practice.
  • Equity requires representative datasets. Projects that invest in diverse training data reduce the risk of performance disparities.

Derma Vision benefits from being deployed initially in a community screening context under school oversight. That setting allows the team to gather real-world feedback and iterate with clinician partners before scaling.

Practical roadmap: what the Derma Vision team should prioritize next

The students and their mentors face a concrete set of tasks to translate pilot success into safe, scalable impact. Recommended priorities:

  1. Clinical partnership: Formalize agreements with dermatologists or dermatology departments for prospective validation and annotation of cases.
  2. Ethics oversight: Seek institutional review for studies and implement comprehensive informed-consent processes for image collection and model improvement.
  3. Data governance: Build secure storage, anonymization pipelines, and clear retention policies. Publish a privacy statement and user-facing terms.
  4. Validation plan: Design retrospective and prospective studies, predefine success metrics and plan statistical analyses.
  5. Usability testing: Pilot capture workflows in diverse lighting and device conditions; refine instructions and UI/UX for clarity.
  6. Human-in-the-loop triage: Establish a teledermatology review tier for flagged cases as a safety net before formal referrals.
  7. Transparent communication: Publish validation results and limitations to avoid overclaiming clinical capability.
  8. Regulatory consultation: Engage with regulatory authorities or consultants to map the pathway for SaMD classification if clinical use is intended.

Pursuing these steps will convert an impressive student innovation into a responsibly governed health screening tool.

Education, innovation and the role of youth-led projects in public health

Projects like Derma Vision illustrate how educational institutions can channel curiosity into community benefit. When young learners tackle real problems—from screening fellow citizens to competing nationally—they develop technical skills and civic responsibility. The process also shows how mentorship from clinicians and engineers accelerates learning and improves outcomes.

Youth-led initiatives have advantages: they bring fresh thinking, cost-effective solutions and community trust. To translate such projects into sustainable interventions, schools and mentors should connect teams with clinical partners, legal counsel and funding channels. Structured incubators within education ecosystems can guide teams through ethics, validation and scale-up.

Examples of successful student projects often share a common pattern: a clear local need, iterative prototyping with stakeholder feedback, mentorship and staged validation. Derma Vision’s Women’s Day screening fits that pattern: local deployment provided immediate feedback, while national recognition opens doors for mentorship and support.

Integration with health systems: teledermatology and referral networks

For Derma Vision to deliver measurable health impact, it must link screening outputs to care pathways. Teledermatology and referral networks provide the necessary bridge.

  • Teledermatology: Remote consultation with a dermatologist can triage flagged participants efficiently. Many dermatologists can provide visual assessment and guide whether in-person visits are necessary.
  • Referral mapping: Establish which local clinics or hospitals accept referrals and whether financial or logistic support is needed for participants.
  • Follow-up tracking: Integrate simple follow-up mechanisms (SMS/phone calls) to confirm whether participants sought care and what the outcome was. That data closes the loop on effectiveness.
  • Training community health workers: Equip front-line workers with capture protocols and the knowledge to explain results and next steps.

Digital tools are most effective when embedded in human systems. Screening without referral and follow-up risks becoming an isolated exercise with limited benefit.
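The follow-up tracking idea above reduces to a simple record-keeping loop: log each referral, record whether care was sought, and measure how often the loop closes. The field names and statuses below are assumptions for illustration, not an existing Derma Vision schema.

```python
from dataclasses import dataclass

# Minimal sketch of referral follow-up tracking: record each flagged case and
# compute the loop-closure rate. Field names and statuses are illustrative.

@dataclass
class Referral:
    participant: str                    # pseudonymous ID
    flagged_on: str                     # screening date
    followed_up: bool = False           # did the participant seek care?
    clinical_outcome: str = "pending"   # e.g. "confirmed", "benign", "pending"

def closure_rate(referrals) -> float:
    """Fraction of referrals where the participant completed a consultation."""
    if not referrals:
        return 0.0
    return sum(r.followed_up for r in referrals) / len(referrals)

refs = [
    Referral("a1f3", "2026-03-08", followed_up=True, clinical_outcome="benign"),
    Referral("b7c2", "2026-03-08"),  # no follow-up recorded yet
]
print(f"{closure_rate(refs):.0%} of referrals closed")
```

Even a tracker this simple, fed by SMS or phone confirmations, would tell the team whether the ~50 flagged participants actually reached care and whether the flags were confirmed.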

Potential social benefits and risks in low-resource settings

Accessible screening tools can lower barriers to recognizing treatable skin conditions, particularly among populations with limited access to specialist care. Detecting treatable chronic conditions, infections, or suspicious lesions early improves outcomes and reduces the burden of advanced disease.

However, risks exist. Unvalidated tools can generate fear, overload clinics with false positives, or miss serious conditions due to dataset limitations. Careful deployment with clinician partnerships and staged validation minimizes these risks. For community-based projects run by schools, aligning with local health authorities increases legitimacy and ensures appropriate resource allocation for follow-up care.

The students behind Derma Vision: names, roles and the learning journey

Derma Vision was developed by Dishen Gadhiya, Hetansh Patel, Yug Dalsania, and Janmesh Darji—students from Classes 7 and 8 at Zebar School. Their work combined software development, data handling and community engagement.

Projects at this level teach technical skills—coding, model training and UI design—alongside soft skills like ethics, communication and collaboration. The students’ decision to include basic care guidance and to route only certain cases for specialist review shows early attention to responsible triage. Their next challenge will be to work with clinicians and data stewards to translate hands-on learning into reproducible, safe practices.

Scaling responsibly: scenarios for broader deployment

If clinical validation supports Derma Vision’s accuracy and safety, possible deployment scenarios include:

  • School health camps: Periodic screenings for students, staff and families, integrated with school health services.
  • Community health drives: Partner with local NGOs and municipal health authorities to run screenings at community centres and festivals.
  • Primary-care augmentation: Provide general practitioners and health workers with a triage tool to flag cases needing dermatology referral.
  • Teletriage hub: Centralize dermatologist review of flagged cases to reduce unnecessary in-person referrals.

For each scenario, developers must tailor model thresholds, capture protocols and referral workflows. Local pilots with outcome tracking are essential before wider rollout.

Funding, partnerships and sustainability

Sustaining a public-health screening tool requires funding, institutional partnerships and a governance model. Potential partners include:

  • Academic hospitals: Clinical validation, mentorship and teledermatology support.
  • Public health departments: Facilitate community deployments and integrate screenings with broader health programmes.
  • NGOs and foundations: Provide funding for pilot projects, training and equipment.
  • Technology incubators: Help with product development, regulatory planning and scaling.

A sustainability plan should consider device and hosting costs, clinician time for teletriage, data storage and updates to the model as more annotated data becomes available.

Where Derma Vision fits in the larger picture of AI in healthcare

Derma Vision exemplifies a growing trend: local innovators applying AI to narrow, well-defined screening tasks with clear user interfaces. When deployed thoughtfully, such tools can expand preventive care, especially in settings where specialist access is limited. They also test the boundaries of how non-clinical teams can contribute to healthcare innovation.

The broader challenge for AI in healthcare remains constant: combining technical performance with clinical evidence, transparent reporting and robust governance. Tools that achieve that combination will operate as trusted assistants in the clinician’s toolkit rather than as standalone diagnosticians.

Conclusion

Derma Vision’s emergence from a school project to a nationally awarded prototype, and its real-world screening of hundreds of participants, suggests both the promise and complexity of AI-driven preventive health tools. The app provides a functioning model of how computer vision can prompt early action on skin problems. The team’s next steps—clinician partnerships, ethical safeguards, robust validation and pathway planning—will determine whether the app becomes a safe, scalable public-health instrument or remains an instructive prototype.

That pathway embodies the broader imperative of translating youthful ingenuity into responsibly governed health technologies: technical creativity must be matched by clinical evidence and ethical stewardship.

FAQ

Q: Is Derma Vision a diagnostic tool? A: No. Derma Vision is a screening and awareness tool that analyzes visible skin patterns and provides guidance. Its output is intended to prompt either self-care or consultation with a dermatologist. Formal diagnosis requires a clinical assessment.

Q: How many people were screened and how many were referred? A: The app was used during a Women’s Day programme where more than 450 participants were scanned. Around 50 participants were advised to seek dermatology consultation based on the app’s screening.

Q: Who developed Derma Vision? A: The app was developed by four students at Zebar School, Ahmedabad: Dishen Gadhiya, Hetansh Patel, Yug Dalsania, and Janmesh Darji, who are in Classes 7 and 8.

Q: Has the app been clinically validated? A: The project has completed a community screening and received positive attention, including an award. The developers have stated plans to collaborate with dermatologists for formal validation and to strengthen ethical and clinical safeguards. Controlled validation studies remain necessary to quantify accuracy and safety.

Q: What kinds of skin conditions can the app detect? A: The app analyzes visible skin patterns that may be associated with a range of dermatological issues. It is designed to flag patterns that warrant specialist review. The app is not a substitute for clinical evaluation and cannot definitively identify all conditions from images alone.

Q: How will participant privacy be protected? A: The school has indicated plans to improve ethical safeguards. Best practices for privacy include informed consent, anonymization, secure storage, limited retention and transparent policies about data use. Any future deployment should make these protections explicit and follow local regulations.

Q: Could the app produce false positives or negatives? A: Yes. Image-based screening carries the risk of both false positives (flagging benign findings) and false negatives (missing concerning lesions). Human-in-the-loop workflows and prospective validation studies help quantify and reduce these errors.

Q: What are the next steps for the Derma Vision team? A: Priorities include clinical validation with dermatologists, ethics approvals, usability studies, secure data governance, establishing teledermatology review pathways, and consulting regulatory frameworks if clinical deployment is intended.

Q: Can Derma Vision replace dermatologists? A: No. The tool is intended as an aid to awareness and triage. Dermatologists provide diagnostic expertise, biopsies, treatment decisions and longitudinal care that an image-based app cannot replace.

Q: What does winning the Vigyantram award mean for the project? A: The award at the Vigyantram National Championship, hosted by IIT Delhi, signals technical recognition and improves visibility. It can open doors to mentorship, clinical partnerships and potential funding, but it does not replace clinical validation or regulatory approval.