networks

  • New diagnostic tools leverage the power of Artificial Intelligence.

    Too Much Data to Process

    By 2030, all Baby Boomers will be over 65, which means health-related issues are becoming more and more top-of-mind for this aging population. Regular cancer screenings are one such concern, and the sheer volume of these screenings already taxes existing healthcare systems. However, laboratories, diagnostic technicians, and healthcare providers are using powerful new technological tools to aid them in the work of helping patients live happier and healthier lives. One such tool is Artificial Intelligence, commonly referred to as AI. Unlike its counterparts depicted in the movies as sentient neural networks whose sole purpose is to destroy humanity, real AI has been a staple of computing and data processing for decades; it is as quotidian as the electric power grid and the supermarket. From predictive weather modeling for meteorologists to CAD-based generative design for engineers, AI has proven to be a powerful everyday tool across many industries.

    In healthcare, data modeling and data processing at these massive volumes have become synonymous with AI-driven environments. Take, for example, liquid biopsies used to better predict infant cancers. The data associated with these tests is referred to as high-throughput data: the number of measurements per patient is orders of magnitude larger than the sample space of patients with known outcomes. Making connections between that high-throughput data and the much smaller outcome sample space of patient responses is essential, and the results of these AI-driven computations expedite the determination of whether or not a patient has cancer. Statistical models are useful for summarizing and describing variation in the data, and machine-learning AI leverages those summaries to make more useful predictions.

    Imaging for Data Collection and AI Processing

    From X-rays, to CT (CAT) scans, to MRIs, in vivo imaging technology has been one of the most powerful medical...
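    The high-throughput situation described above, far more measurements per sample than samples with known outcomes, can be sketched with a minimal nearest-centroid classifier. The data here is synthetic and every number is illustrative, not drawn from any real study; the per-feature class means are the statistical "summaries" the machine-learning step leans on:

    ```python
    import random

    random.seed(0)

    N_FEATURES = 1000   # measurements per sample (high-throughput)
    N_PER_CLASS = 10    # known patient outcomes are comparatively scarce

    def make_sample(shift):
        # Synthetic "biomarker" vector; the class signal is a small mean shift.
        return [random.gauss(shift, 1.0) for _ in range(N_FEATURES)]

    healthy = [make_sample(0.0) for _ in range(N_PER_CLASS)]
    cancer = [make_sample(0.3) for _ in range(N_PER_CLASS)]

    def centroid(samples):
        # Per-feature mean: a simple statistical summary of each class.
        return [sum(col) / len(col) for col in zip(*samples)]

    c_healthy, c_cancer = centroid(healthy), centroid(cancer)

    def predict(x):
        # Classify by squared distance to each class summary.
        d_h = sum((a - b) ** 2 for a, b in zip(x, c_healthy))
        d_c = sum((a - b) ** 2 for a, b in zip(x, c_cancer))
        return "cancer" if d_c < d_h else "healthy"

    print(predict(make_sample(0.3)))
    ```

    With 1000 features and only 10 labeled samples per class, the weak per-feature signal only becomes usable because it is aggregated across all features, which is the essence of the high-throughput-versus-small-sample problem.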
  • Blockchain described in one sentence: "A blockchain is really a kind of database that’s shared across loads of different computers that are each running the same software; each bit of data is secured using some complicated bits of cryptography that means that only people that are meant to be adding to or editing the data can do that job." (WIRED Magazine, 2018)

    Recently, I had a job interview with a company whose business model is built on providing add-on services for their customers' databases, which run on PostgreSQL, the open-source centralized DB platform. The reason I was probably considered for their digital copywriting role is my previous experience writing for tech companies like PTC, Satcon, and L-1, to name a few. For PTC, I did a great deal of writing for their PLM Product Marketing Group. PLM (product lifecycle management) is a massive technology platform and manufacturing methodology that relies heavily on data-driven digital-thread content, product data management, and databases, to name a few. So, as you can see, I know a thing or two about databases. I have also written about Blockchain Technology (or BlockTech, the portmanteau I will use from now on) in the past, so I am well aware of the hot new trends around this cutting-edge decentralized data-repository and processing platform.

    During the interview, I asked a simple question: "How is your approach to utilizing a centralized DB a value-add over the hot new decentralized DB technology trend known as Blockchain?" (Digital Batman's Alter Ego, Nick, 2021)

    Needless to say, the developer I was interviewing with did not like the question all that much. His answer was more defensive than enlightening: "…centralized DBs are not going away anytime soon, so people need to understand that Blockchain is more like a curiosity right...
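    The WIRED one-liner quoted above, a shared database where cryptography guards what gets added, can be sketched in miniature: each block stores a SHA-256 hash of its predecessor, so tampering with any earlier entry invalidates everything after it. This is a toy illustration of the hash-chaining idea, not the on-disk format of any real blockchain:

    ```python
    import hashlib
    import json

    def block_hash(block):
        # Hash the block's contents (excluding its own stored hash).
        payload = json.dumps(
            {"data": block["data"], "prev_hash": block["prev_hash"]},
            sort_keys=True,
        )
        return hashlib.sha256(payload.encode()).hexdigest()

    def add_block(chain, data):
        prev_hash = chain[-1]["hash"] if chain else "0" * 64
        block = {"data": data, "prev_hash": prev_hash}
        block["hash"] = block_hash(block)
        chain.append(block)

    def is_valid(chain):
        # Every block must hash correctly and point at its predecessor.
        for i, block in enumerate(chain):
            if block["hash"] != block_hash(block):
                return False
            if i > 0 and block["prev_hash"] != chain[i - 1]["hash"]:
                return False
        return True

    chain = []
    add_block(chain, "alice pays bob 5")
    add_block(chain, "bob pays carol 2")
    print(is_valid(chain))                   # True: untampered chain verifies

    chain[0]["data"] = "alice pays bob 500"  # rewrite history in block 0
    print(is_valid(chain))                   # False: stored hash no longer matches
    ```

    A centralized DB lets an administrator silently rewrite that first record; in the chained design, every node re-running `is_valid` detects the edit, which is the core trade-off my interview question was poking at.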