Registration for TECHNA 2019 is now full; any further invitees will be placed on a waitlist. This is an invitation-only event. If you have not received an invitation and would like to attend, please contact Meera at firstname.lastname@example.org
101 College Street, Toronto, Ontario
Schedule & Speaker Bios
Below is the event schedule for TECHNA 2019: Human/Machine Interfaces. Click the event times below to view the agenda and speaker bios. Video recordings of individual talks will be posted within each segment on this page after the event. The TECHNA 2019 Symposium will also feature a live tweeting system so participants can add comments in parallel to the discussions and presentations.
8:15-9:00 – Breakfast and Registration
9:00-9:15 – Opening Remarks
Senior Director, TECHNA and Diagnostic Innovations, UHN
Dr. Luke Brzozowski leads TECHNA’s Technology Development Team of over fifty technical project managers, engineers, and software developers, as well as operational, quality, regulatory, marketing, and financial professionals, who lead, manage, and carry out health technology productization programs and projects in a hospital environment.
Prior to joining TECHNA at its inception (then as Director of Operations and Engineering), Luke held management positions in development, marketing, legal, and regulatory departments in the health technology industry and at a pharmaceutical CRO. Luke is the recipient of the 2003 Governor General’s Gold Medal for his doctorate.
9:15-10:00 – Keynote: Reimagining Work in the Age of AI
Managing Director of Information Technology and Business Research, Accenture Research
H. James Wilson is Managing Director of IT and Business Research at Accenture Research, where he leads global research programs on the impact of AI on work. Wilson is co-author of the best-selling Human + Machine: Reimagining Work in the Age of AI (Harvard Business Review Press). He is author or contributing author of eight books on the impact of technology on work and society, most recently AI, Analytics, & The New Machine Age (HBR Press, 2019) and How to Go Digital (MIT Press, 2019).
Wilson wrote “The Jobs Artificial Intelligence Will Create,” MIT Sloan Management Review’s #1 most-read article of the year, and is a longtime contributor to The Wall Street Journal and HBR. His latest HBR article is “The Future of AI Will Be About Less Data, Not More.”
Abstract: This talk will look at the unprecedented new jobs, work processes, and skills required for working with artificial intelligence systems. H. James Wilson, co-author of the best-selling book Human + Machine: Reimagining Work in the Age of AI (Harvard Business Review Press), will draw on significant global research and case studies to describe the actions leaders must take to responsibly start and shape an AI-focused initiative inside their organizations.
10:00-10:45 – Post-Keynote Panel
Chief Digital & Technology Officer, MD Anderson Cancer Center
Dr. David Jaffray graduated from the University of Alberta with a B.Sc. in Physics (Hons.) in 1988 and completed his Ph.D. in the Department of Medical Biophysics at the University of Western Ontario in 1994. Following graduation, he took a position as Staff Physicist in the Department of Radiation Oncology at William Beaumont Hospital in Michigan where he instigated a direction of research that garnered funding from the NIH and from congressionally-directed funding programs. Dr. Jaffray became a Board Certified Medical Physicist (ABMP – Radiation Oncology) in 1999.
In 2002, Dr. Jaffray joined the Princess Margaret Hospital in Toronto as Head of Radiation Physics and a Senior Scientist within the Ontario Cancer Institute. David holds the Fidani Chair in Radiation Physics, is the Director of the TECHNA Institute for Health Technology Development at the University Health Network, and recently became the Executive Vice President of Technology and Innovation at the University Health Network. He is a Professor in the Departments of Radiation Oncology and Medical Biophysics and the Institute for Biomaterials and Biomedical Engineering at the University of Toronto. His primary area of research has been the development and application of image-guided therapy. He has over 5 patents issued and several licensed, including kilovoltage cone-beam computed tomography for image-guided radiation therapy. Dr. Jaffray has >200 peer-reviewed publications in the field, >100 invited lectures, and holds numerous peer-reviewed and industry-sponsored research grants. He sits on numerous scientific and research boards and has contributed to the NIH and CIHR grant review process for several years. He is an active member of the AAPM and has a teaching role in workshops and the annual meeting of the American Society of Therapeutic Radiation Oncology (ASTRO). He has an active interest in commercialization and has led the development of a variety of commercial products, including software and hardware for QA and small animal irradiator systems for basic research. He has successfully supervised over 20 graduate students and fellows.
Dr. Jaffray has won each of the major prizes in the field of medical physics, including the Sylvia Sorkin-Greenfield Award, the Farrington Daniels Award, and the Sylvia Fedoruk Award. In 2004, he was named one of Canada’s Top 40 Under 40 and was recognized by The University of Western Ontario with its Young Alumni Award. His current research interests focus on the development of imaging technologies and methods for image-guided interventions, including radiation therapy, drug delivery, and surgery.
Jay Vidyarthi is a strategic design consultant who can help you put the people you serve at the heart of your design process. His vision is to have an exponential positive impact by empowering mission-driven organizations, building their capacity for human-centered design. He currently serves as a coach and collaborator for a number of teams working to bring more mindfulness, mental health, and well-being into the world.
Jay led UX at the startup behind Muse: the brain-sensing headband, a category leader in transformative technology. He has also done over a decade of UX/UI/PM work for Fortune 500 companies, scrappy startups, hospitals, governments, and nonprofits. Top-tier press and influencer lists have consistently recognized Jay’s work (Forbes, TransTech200, Fast Company…), and he has delivered 40+ speaking/teaching engagements around the world (MIT, Harvard, CHI, DIS…).
As the inventor of Sonic Cradle, Jay authored a number of early influential academic publications about mindful technology. He also helped create the annual A Mindful Society conference which draws hundreds of leaders bringing mindfulness and compassion into healthcare, education, government, tech, and organizations. As an attention activist, he is passionate about resisting the exploitation of our attention through innovation, regulation, ethics, leadership, education and mindfulness.
Executive Director, Surgical Services, UHN
Terri Stuart-McEwan is the Executive Director of Surgical Services at UHN and an Adjunct Professor at the Bloomberg Faculty of Nursing at the University of Toronto. She trained as a Registered Nurse, graduating in 1985, at Ryerson University and the University of Saskatchewan. She completed her Master’s in Health Leadership in 2005.
Over the last 25 years, Terri has held progressive leadership roles across Canada, including in Saskatchewan, Nova Scotia, and Ontario. These experiences include serving as Clinical Director in three Ontario hospitals, providing operational, financial, and quality oversight in surgical and oncology programs. She has been the Cancer Care Ontario regional LHIN Lead for both academic and community hospitals in surgery, oncology, and endoscopy.
Over the last 10 years at UHN, her accomplishments include serving as Executive Director for Surgical Services across Toronto General, Toronto Western, and Princess Margaret, and as Executive Director at Princess Margaret for the solid tumour oncology programs. She has extensive experience in strategic assessment and partnership engagement, collaborating with the MOHLTC and programs across Ontario and Canada, and with international partners in Kuwait and Qatar.
She actively participates in academic and research activities with a focus on leadership and implementation science in clinical practice. In addition, she is co-chair of the UHN Health Information System (HIS) clinical implementation committee as UHN embarks on this massive change over the next 3 years.
Patient Partner, UHN
In November 2014 I sustained a C5 C6 spinal cord injury. Prior to that I worked as a commercial industrial electrician. Since then my life has done a complete 180° turn. All the things I thought mattered, really didn’t matter. I now had to face a whole new world of actual problems, adaptations, and just simply surviving day to day, literally minute to minute.
My journey has taken me through the full spectrum of the healthcare system from the 911 call, emergency decompression surgery, ICU, inpatient, outpatient, rehab, transitional housing, and finally living independently in the community.
As my recovery began to progress, I began to look into how I could re-engage with life. Advocacy quickly became a natural and easy fit for me. Now 4 years out, I am a Patient Partner with UHN.
I have been involved in work with the patient portal, HACs, imaging speaking panels, the AODA council, and many more.
I am indebted to the people at UHN who saved my life after my accident and to the current staff who have helped me give my new life meaning, purpose, and dignity.
It is both my pleasure and honour to give back to my community, collaborate with UHN staff, and be a Patient Partner.
Team Lead, Toronto Video Atlas project, UHN
Albert graduated from the University of Toronto with a Master’s degree in Biomedical Communications in 2010.
He is currently serving as Team Lead of the Toronto Video Atlas project, headed by Dr. Ian McGilvray at the Toronto General Hospital.
The Video Atlas project aims to deliver surgical education to trainees worldwide through innovative, cutting-edge technology and media, including high-definition surgical footage, 3D computer animation, and most recently augmented and virtual reality.
Albert has taken an active role in developing experimental virtual reality modules for use in surgical education, as well as surgical planning. A few select modules can be viewed in detail at http://tvasurg.ca/blog
Surgical Scientist, UHN
Dr. Taymaa May received her medical degree from McGill University in Montreal and went on to complete post-graduate training in Obstetrics and Gynecology at the University of Toronto. During her residency, Dr. May completed the Clinician Investigator Program and received a Master of Science degree from the Institute of Medical Sciences at the University of Toronto. Following residency, Dr. Taymaa May completed a 3-year fellowship in Gynecologic Oncology at Brigham and Women’s Hospital/Dana-Farber Cancer Institute, Harvard Medical School in Boston, MA.
Dr. Taymaa May is a surgical scientist at the University Health Network and an Assistant Professor at the University of Toronto. Dr. May has clinical training in the medical management of women with gynecological malignancies and surgical training in complex oncologic surgeries including pelvic, abdominal, laparoscopic and robotic surgery. Her research interests are in translational science and clinical trials in the field of ovarian cancer.
10:45-11:00 – Break
11:00-12:00 – AI and Sensory Technologies
Interim Radiologist-in-Chief, Joint Department of Medical Imaging
Dr. Schmidt is Professor of Radiology at the University of Toronto, and the interim radiologist-in-chief of the Joint Department of Medical Imaging (JDMI) at the University Health Network, Sinai Health System and Women’s College Hospital.
She trained in Germany, where she graduated from medical school in 1988 and finished her residency as a diagnostic radiologist in 1993. Following a research fellowship at the University of California, San Francisco (UCSF) in 1994, Dr. Schmidt went back to Germany, then returned to North America in 1997, where she remained on the faculty of the Department of Radiology at UCSF until she moved to Toronto in 2002. Since then, Dr. Schmidt has been a staff radiologist in the cardiothoracic division of the JDMI.
For more than 10 years, from 2007 to 2018, Dr. Schmidt was the Site Director for Medical Imaging at Women’s College Hospital. For 6 years, she was also the head of the chest section in the JDMI, before she became head of the cardiothoracic division in March 2016. In February of this year she was appointed interim radiologist-in-chief.
In 2003, Dr. Schmidt started the Lung Cancer Screening Study at PMH as part of the International Early Lung Cancer Action Program (I-ELCAP), and she is the local Principal Investigator of the Pan-Canadian Early Lung Cancer Detection Study. More than 5000 individuals have been screened under her supervision in those research studies. Since early 2017 she has been the Radiology Quality Lead of High Risk Lung Cancer Screening at Cancer Care Ontario.
Alexandre Le Bouthillier holds a PhD in Operations Research from the University of Montreal and completed a postdoctoral program at the University of Geneva. He is the co-founder of Planora, an artificial intelligence & optimisation company. Alexandre was VP Science & Technology at RedPrairie (now JDA Software) following RedPrairie’s acquisition of Planora.
An angel investor, he invests both talent and capital in technological ventures. He has received scholarships and other academic awards from government research funding agencies, corporations, and professional associations. His research and work interests include operations research, artificial intelligence (machine learning, deep learning), healthcare, oncology, the brain, and big data & cloud infrastructure.
He is co-founder of Imagia, an artificial-intelligence-driven company in the fight against cancer, a board member at MILA (the Quebec Institute of Machine Learning, headed by Prof. Yoshua Bengio), and a board member of IVADO, MEDTEQ, and Montreal InVivo.
Abstract: The future of medicine, and the hope for conquering the complexity of cancer and other high-burden diseases, lies in the intelligent use of digital information (images, genetics, treatments, outcomes, and real-world evidence). Advances in Artificial Intelligence (AI) will uniquely contribute to unlocking data-driven discoveries.
Leveraging advances in AI to accelerate the advent of accessible personalized medicine, this talk discusses a new digital discovery health platform using distributed data. This AI-first platform enables clinician-driven, end-to-end discovery of AI biomarkers with the greatest chance of clinical benefit for patients.
Three types of AI digital biomarkers with examples will be covered during this presentation.
Dr. Cesar Marquez-Chin is a scientist at KITE, Toronto Rehabilitation Institute – University Health Network and an Assistant Professor at the Institute of Biomaterials and Biomedical Engineering, University of Toronto. He has dedicated his entire professional career to the rehabilitation of individuals with disabilities. His research interests include assistive technologies and neurorehabilitation after spinal cord injury and stroke. Central to his activities is the development of systems that can use brain activity to control electronic devices. These brain-computer interfaces promise to become a valuable tool for restoring voluntary movement after paralysis. His recent work includes combining brain-computer interfaces and functional electrical stimulation therapy, a technique that uses electrical discharges to produce functional movements in paralyzed muscles. This combination of technologies has been successful in restoring the ability to reach and grasp after stroke and SCI, even when other forms of therapy have had limited efficacy. Dr. Marquez-Chin holds a doctorate degree from the Institute of Biomaterials and Biomedical Engineering, University of Toronto.
Abstract: Stroke and spinal cord injury often result in paralysis which can decrease independence and ultimately affect the quality of life. The last few decades have seen the development of important new interventions to help restore the ability to move voluntarily. Today, clinicians have an unprecedented number of tools at their disposal with new therapies and technologies being created regularly. One important intervention is functional electrical stimulation therapy, in which patients attempt functional movements repeatedly. Simultaneously, a therapist triggers trains of electrical pulses that are applied to paralyzed muscles selected specifically to produce the intended movement. This talk will describe a new version of this therapy which integrates brain-computer interfacing technology. In this new approach the stimulation is triggered by a person’s own brain activity indicative of the intention to move. Early results suggest that this new intervention can be effective in restoring voluntary motor function even in cases in which all other interventions have been unsuccessful.
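As a deliberately simplified sketch of the triggering idea described in the abstract, and not the speakers' clinical system, a brain-computer interface can gate stimulation on a spectral feature of the EEG. Everything below (the function name, the 8–12 Hz band, the 50% band-power threshold, and the simulated signals) is an illustrative assumption; real systems typically use trained classifiers on sensorimotor rhythm changes.

```python
import numpy as np

def intent_detected(eeg_window, fs=250.0, band=(8.0, 12.0), threshold=0.5):
    """Trigger when the chosen band holds most of the window's spectral power."""
    spectrum = np.abs(np.fft.rfft(eeg_window)) ** 2
    freqs = np.fft.rfftfreq(len(eeg_window), d=1.0 / fs)
    in_band = (freqs >= band[0]) & (freqs <= band[1])
    return spectrum[in_band].sum() / spectrum.sum() > threshold

# Simulated one-second EEG windows at 250 Hz.
fs = 250.0
t = np.arange(0, 1.0, 1.0 / fs)
rng = np.random.default_rng(0)
rest = 0.1 * rng.standard_normal(t.size)       # background noise only
attempt = rest + np.sin(2 * np.pi * 10.0 * t)  # strong 10 Hz rhythm added

print(intent_detected(rest))     # no trigger: stimulation stays off
print(intent_detected(attempt))  # trigger: deliver the stimulation train
```

In a full system the boolean returned here would gate the electrical pulse trains applied to the selected muscles.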
Director of Artificial Intelligence, Surgical Safety Technologies Inc.
Frank Rudzicz is a scientist at the Li Ka Shing Knowledge Institute at St Michael’s Hospital, Director of Artificial Intelligence at Surgical Safety Technologies Inc., an associate professor of Computer Science at the University of Toronto, co-founder of WinterLight Labs Inc., a faculty member at the Vector Institute for Artificial Intelligence, President of the international joint ACL/ISCA special interest group on Speech and Language Processing for Assistive Technologies, Inaugural Chair of the Standards Council of Canada’s subcommittee on Artificial Intelligence, and a CIFAR Chair in Artificial Intelligence. He is the recent recipient of the Young Investigator award from the Alzheimer’s Society of Canada, the Early Researcher award from the Government of Ontario, the Excellence in Applied Research award from Speech-Language & Audiology Canada, and the Connaught Innovation Award.
His work is in machine learning in healthcare, especially in natural language processing, speech recognition, and surgical safety. His research has appeared in popular media such as Scientific American, Wired, CBC, and the Globe and Mail, and in scientific press such as Nature.
Abstract: In this talk, we take a path through several different approaches to explainability in machine learning. First, we talk about categories of explainability, then we discuss approaches to relevance ranking in terms of engineered features and in image data. We then provide a use case of a recent publication on using machine learning with MEG data, and briefly cover how explainable AI may help to overcome regulatory and cultural issues in healthcare and therefore accelerate the use of AI methods in practice.
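One model-agnostic way to do the relevance ranking of engineered features mentioned in the abstract is permutation importance: shuffle one feature at a time and measure how much the model's score drops. The talk does not specify this exact method; the function and names below are a generic sketch, and the "model" is a stand-in lambda rather than a trained network.

```python
import numpy as np

def permutation_importance(model, X, y, metric, n_repeats=10, seed=0):
    """Rank features by how much randomly shuffling each one degrades the score."""
    rng = np.random.default_rng(seed)
    baseline = metric(y, model(X))
    drops = np.zeros(X.shape[1])
    for j in range(X.shape[1]):
        for _ in range(n_repeats):
            Xp = X.copy()
            rng.shuffle(Xp[:, j])  # break the link between feature j and the target
            drops[j] += baseline - metric(y, model(Xp))
    return drops / n_repeats  # larger drop = more relevant feature

# Toy check: the target depends only on feature 0, so it should rank first.
rng = np.random.default_rng(1)
X = rng.standard_normal((200, 3))
y = 3.0 * X[:, 0]
model = lambda data: 3.0 * data[:, 0]  # stand-in for a trained model
r2 = lambda truth, pred: 1.0 - ((truth - pred) ** 2).sum() / ((truth - truth.mean()) ** 2).sum()

scores = permutation_importance(model, X, y, r2)
print(scores.argmax())  # feature 0 ranks as most important
```

For image data, the analogous idea is to occlude or perturb regions of the input and watch the prediction change, which yields the saliency-style relevance maps the talk alludes to.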
12:00-1:15 – Lunch – AR Art Exhibit
ReBlink Pop-up Show, Impossible Things
1:15-2:15 – AR/VR Technologies
Surgeon, Unity Health Toronto
Dr. Blake Murphy is a Plastic & Reconstructive Surgeon at St. Michael’s Hospital in Toronto, Canada. Blake completed his undergraduate degree at Simon Fraser University and then his PhD in Medical Biophysics at the University of Western Ontario. Following this, he completed medical school at the University of Ottawa and Plastic Surgery residency at the University of Toronto. Following residency, he added a one-year Craniofacial Fellowship in Miami, Florida, and then a one-year Microsurgery and Breast Reconstruction fellowship at University Health Network. His clinical practice focuses on trauma and oncology reconstruction, with particular emphasis on craniofacial surgery and breast reconstruction.
Blake’s research interests include augmented reality applications in surgery, and he is appointed as an Affiliate Scientist at the TECHNA Institute, University Health Network. His research group focuses on the development of augmented reality applications for surgical education. The group also has an interest in augmented reality applied to surgical planning, including patient-specific surgical plans. Beyond these applications, the group is working toward a system that can be integrated into the clinical setting for intra-operative navigation and feedback.
Medical Physicist, Ottawa Hospital
Justin Sutherland is an Assistant Professor of Radiology at the University of Ottawa and a medical physicist at the Ottawa Hospital. He is also the founder of realizeLAB, a medical virtual reality research lab based at The Ottawa Hospital that focuses on exploring unique and creative ways that virtual and augmented reality (VR/AR) can be used to facilitate the jobs of medical professionals, particularly in the domain of medical image interaction. Justin’s innovations include the development of a comprehensive framework for using modern virtual and augmented reality technologies in the clinical setting, work that has been recognized with awards from the Journal of Digital Imaging and numerous invited presentations at the annual meetings of the Radiological Society of North America and elsewhere.
Abstract: Medical image review has progressed through a few technological iterations, from films displayed on lightboxes to radiology workstations with LCD monitors that can present a mix of 2D and volumetric images. Today, a growing number of virtual reality (VR) systems can create immersive simulated virtual environments. When used correctly and carefully, these VR technologies can improve the way clinicians interact with medical images.
This talk will begin by describing the distinguishing features and functions of various virtual and augmented reality technologies, stratifying them along a few key technology-defining characteristics. A comprehensive framework for applying these technologies to advanced visualization of medical images and models will then be presented and discussed, before examining the deceptively complex question of how to quantify and measure image quality within a virtual environment. Using these concepts, several past and present projects based in VR at The Ottawa Hospital will be presented with the aim of showing how this exciting technology has potential to change the way we interface with medical images.
Associate Director, PERK Lab/Queen’s University
Dr. Andras Lasso is Associate Director of the Laboratory for Percutaneous Surgery at Queen’s University, specializing in translational research and system development for minimally invasive interventions. Andras is a lead contributor to software packages including 3D Slicer and the PLUS toolkit, which are used by academic groups and companies worldwide.
Prior to joining the lab at Queen’s University, he worked for 9 years at GE Healthcare as a senior engineer, developing software components for Innova interventional X-ray systems and the Advantage Workstation, and led the development of advanced image visualization, quantification, real-time image fusion, and guidance applications.
Abstract: Virtual reality (VR) systems allow experiencing dynamic 3D content in a fully immersive virtual environment, which has been shown to be useful in a variety of medical applications. VR hardware is now easily accessible and very affordable, but software development can be challenging, since commonly used VR software frameworks are not well suited for medical applications. 3D Slicer, a widely used open-source medical image visualization and analysis application, has been extended with a new VR interface (www.slicervr.org). This interface allows leveraging 3D Slicer’s powerful features in VR, while the software also remains usable as a regular desktop application. Use cases in surgical planning, training, and education will be presented, along with practical advice on how to implement such applications rapidly, with minimal development overhead.
Director, Strategic Partnerships & Innovation, FingerFood
Sarah Nahid is the Director of Innovation and Strategic Partnerships for the Finger Food Advanced Technology Group. Finger Food is a custom technology innovation company with a cross-industry global clientele, including Fortune 500 companies. Sarah drives innovation across various industries (retail, gaming & entertainment, natural resources, the financial sector, education) and is the SME for the retail and health care verticals. She works with C-level executives and key stakeholders at her client organizations to co-develop a technology innovation strategy and solution, and oversees the implementation of the resulting vision by working with clients and internal teams (content creatives, backend engineers, and project managers). Sarah also builds and grows partnerships with key organizations such as Microsoft and KPMG.
2:15-3:15 – Digital Humans
Head of JLABS Canada
As Head of Johnson & Johnson Innovation, JLABS in Canada, Allan is responsible for external engagement, innovation sourcing, company onboarding, portfolio management, operational excellence, educational programming and P&L. He catalyzes and supports the translation of science and technology into valuable solutions for patients and consumers across the pharmaceutical, medical device, consumer and healthtech sectors.
Allan joined the JLABS team from Janssen Canada, where he spent 12 years in positions of increasing responsibility in business development, marketing, and market access. His most recent role was Therapeutic Lead, Immunology and Primary Care, where he was responsible for market access strategy for a complex product portfolio exceeding $1 billion. Allan started his career at PARTEQ Innovations, the technology transfer arm of Queen’s University at Kingston, where he was responsible for technology assessment and new company start-up. He then moved to Paladin Labs, where he successfully completed numerous in-licensing transactions for specialty pharmaceutical products for the Canadian market. Allan subsequently led business development and partnering initiatives for two early-stage biotechnology companies in Canada prior to joining Janssen.
Allan received his Ph.D. in Neuropharmacology from Queen’s University at Kingston, Ontario and his MBA from McGill University in Finance and Strategy.
Aaron Leibtag is the CEO of Pentavere, a highly regarded health technology company that combines artificial intelligence and natural language processing to extract real-world evidence (RWE) trapped in large repositories of unstructured healthcare data, such as clinical notes, transcription text, and lab and diagnostic reports, without incurring the costs, inaccuracies, and time delays of manual chart review.
Before starting Pentavere, Aaron spent over 15 years leading organizational transformation, analytics, and operations at various global retail organizations and funds that owned consumer-facing assets. Aaron is a former member of the Dean’s Advisory Council at Ryerson University’s School of Retail Management, vice chair of the Sinai Health System Volunteer Advisory Committee, and a current board member of the Museum of Contemporary Art Toronto.
Abstract: There is a serious problem affecting everyone in the health care community: we are all drowning in data, yet we still cannot efficiently and cost-effectively access the important, meaningful, real-world data required to deliver evidence-based healthcare. That is because over 80% of the health information we create is buried in large repositories of unstructured sources, such as clinical notes, transcription text, and lab and diagnostic reports. Despite all the technological advances we hear and read about today, accessing this data in a usable way still requires teams of individuals to manually comb through electronic health records to get at the information that is needed. This is extremely time-consuming, very expensive, variable in accuracy, not replicable, and not scalable. Patients are waiting for us to do better.
Global Practice Lead: Healthcare & Life Sciences, IPSoft
S. Vincent Grasso is the Healthcare & Life Sciences Global Practice Lead for IPsoft Inc., the world’s leading private AI company. His role involves the design and construction of innovative enterprise clinical solutions incorporating the company’s autonomic and cognitive assets. Dr. Grasso leverages his subject matter expertise within the domains of clinical healthcare, technology, and business to optimize value contribution, market differentiation, and competitive advantage.
Global healthcare costs are predicted to increase 5.4% annually between 2017 and 2022, from USD $7.7 trillion to $10.1 trillion. Drivers include an aging and expanding population, obesity, chronic diseases, advancing medical technology, and, increasingly, substance abuse. Sadly, an October 2019 Society of Actuaries report on the opioid epidemic estimates the 2015-2018 US economic burden at $631 billion. Fortunately, the evolving AI ecosystem, including conversational computing, robotic process automation, and machine learning, provides abilities, capabilities, and capacities to assist healthcare delivery systems, government agencies, businesses, and others in addressing these drivers. Scalable deliverables include a multilingual/omnichannel conversational digital workforce, automation of redundant processes, and actionable data analytics. Leaders from healthcare, government, and business now recognize the critical role of AI ecosystem assets in supporting the specific and integrated lines of business required to combat the opioid epidemic. These include the SAMHSA Recovery framework, nutritional optimization, physical therapy, visiting nurse services, logistical support, telemedicine applications, RCM, and many others.
VP Platform Development, UneeQ
Tyler Merritt is a technology evangelist focused on spreading knowledge and enthusiasm to help others see the same vision of the future that he sees, and empower them with the connections and opportunity to help build it. As VP of Platform at UneeQ, he works on conversational AI using digital humans to add emotion to each customer interaction. Leveraging a wide range of developer experience, he brings technical knowledge and understanding across multiple organizations to the continued betterment of the digital human platform. He is driven to ensure every experience with an organization is a brand-led experience.
* Expanding the chatbot platform so that products can evolve to have a ‘human-like’ interface
* Bridging the gap between healthcare and patients as a new educational medium
* Bringing emotional connection to the digital world
3:15-3:30 – Break
3:30-4:15 – Debate: Humanistic Intelligence vs. Artificial Intelligence
Executive Director, Healthcare Human Factors, UHN
Dr. Joseph Cafazzo is the Lead for UHN’s Centre for Global eHealth Innovation, a state-of-the-art research facility devoted to the evaluation and design of healthcare technology, hosting seventy researchers and staff. As a biomedical engineer, Dr. Cafazzo observes healthcare delivery from the inside-out and works on ways to keep people out of hospital by creating technologies that allow for self-care at home.
He has created numerous award-winning mobile health apps, including bant, an application designed for adolescents for the self-management of type 1 diabetes, which received the Stanford 2.0 Award. In 2018, he was named the inaugural holder of the Wolfond Chair in Digital Health. Furthermore, at the University of Toronto, Dr. Cafazzo is an Associate Professor at the Institute of Health Policy, Management and Evaluation and the Institute of Biomaterials and Biomedical Engineering.
Professor, Computer Engineering, University of Toronto
- Visiting Full Professor, Stanford University, Department of Electrical Engineering, Room 216, 350 Serra Mall, Stanford, CA 94305.
- Chair of the Silicon Valley Innovation & Entrepreneurship Forum (SVIEF).
- Founding Member of the IEEE Council on Extended Intelligence.
- Marquis Who’s Who 2018 Albert Nelson Marquis Lifetime Achievement Award.
Invented wearable computing in his childhood, brought this invention to MIT to found the MIT wearable computing project, and “persisted in his vision and ended up founding a new discipline.” — Nicholas Negroponte, MIT Media Lab Director, 1997.
Invented, designed, and built the world’s first smartwatch in 1998 (patent filed 1999, featured on the cover of Linux Journal, July 2000), which he presented at IEEE ISSCC 2000, where he was named “The father of the wearable computer”.
Inventor of HDR (High Dynamic Range) imaging, used in more than 2 billion smartphones. (“The first report of digitally combining multiple pictures of the same scene to improve dynamic range appears to be Mann 1993” — Robertson et al., JEI 12(2).) He originally developed HDR to help people see using his EyeTap Digital Eye Glass invention, which predates Google Glass by more than 30 years.
Founded companies with valuations in excess of $1 billion, together with students.
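The HDR idea credited above, digitally combining multiple pictures of the same scene to improve dynamic range, can be sketched in a few lines. This is a generic illustration that assumes a linear sensor response and uses hypothetical function names; it is not Mann's actual algorithm, which also handles estimating the camera's response curve.

```python
import numpy as np

def merge_exposures(images, times):
    """Estimate scene radiance from differently exposed images of one scene.

    Assumes a linear sensor: pixel = clip(radiance * exposure_time).
    Each image is divided by its exposure time, then averaged with a "hat"
    weight that trusts mid-tone pixels and discounts clipped ones.
    """
    acc = np.zeros_like(images[0], dtype=float)
    wsum = np.zeros_like(acc)
    for img, t in zip(images, times):
        w = 1.0 - np.abs(2.0 * img - 1.0)  # peaks at mid-gray, 0 at the clip points
        acc += w * img / t
        wsum += w
    return acc / np.maximum(wsum, 1e-8)

# Toy check: a bright region clipped in the long exposure is recovered from
# the short one, and a dark region is recovered from the long one.
radiance = np.array([0.05, 0.5, 5.0])
times = [0.1, 1.0]
images = [np.clip(radiance * t, 0.0, 1.0) for t in times]
hdr = merge_exposures(images, times)
print(hdr)  # close to the true radiance [0.05, 0.5, 5.0]
```

The key design point is the weighting: a pixel at the sensor's clip limit carries no radiance information, so it contributes nothing, while well-exposed pixels from any exposure contribute most.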