Category: Insights

Smart City Data Projects Need Transparency and Oversight

Experts at the MetroLab Network Annual Summit warned of the need for oversight of data-heavy public safety projects, while emphasizing the benefits of community engagement.

NEWARK, N.J. — The massive amounts of data collected by cities, and the analytics that data enables, are often trumpeted as forces for the collective good, whether the aim is to make traffic move more smoothly or to improve air quality.
Without proper oversight and policy direction, however, that data can also lead to unjust policing or unchecked surveillance of communities, say researchers and policymakers who have studied the various types of smart city technologies being deployed in municipalities across the country.
“As we look at the next few years, the big challenge, in my mind, is there’s no formal public oversight over technology in our cities,” said Ryan Gerety, a technology fellow at the Ford Foundation, speaking Oct. 15 at the MetroLab Network Annual Summit at Newark’s New Jersey Institute of Technology. “Cities, themselves, recognize this and are looking for mechanisms to correct that.”
The agencies with some of the biggest budgets, and consequently the most technology, tend to be in public safety, said Gerety, and that is where oversight is often thinnest.
“We have many people in the room who are extremely expert at building systems to change, in very positive ways, communities, and people who are choosing to work with those city agencies who want to do that in the best way possible,” said Gerety during the panel discussion “Are Smart Cities Utopian or Dystopian?”
“The flip side of that is in places where you have much more regressive, say, police departments who want to do something different, who will go ahead and do it on their own, and where we don’t have a civil society that is informed in order to push back against illegal or inappropriate measures,” she added. “And so we need to have civil rights organizations, and social justice organizations, at the local level that have the technical capacity to fight back and evaluate these programs, using the best know-how we’ve learned to do it right, and we need legal, formal accountability.”
Three years ago Chicago rolled out its Array of Things project, an enterprise-scale, sensor-driven Internet of Things platform that collects data about the people, places and air quality in Chicago. The network gathers information related to the patterns of people as they move through the city, not data related to individuals.
Researchers designed the platform as an infrastructure for studying intelligent infrastructure: street signals, real-time communication between roadways and vehicles (which requires edge computing), and sensors that measure flooding. The data, which is free and open, is collected every 30 seconds, and the application programming interface (API) is refreshed every five minutes.
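Because the readings are free and open, consuming them programmatically is straightforward. The sketch below averages one batch of observations; the payload shape (node IDs, field names) is an assumption for illustration, not the Array of Things' documented schema.

```python
import json

# Hypothetical payload shaped like a batch of sensor observations; the
# real Array of Things API schema may differ from these field names.
sample = '''[
  {"node_id": "004e", "timestamp": "2018-10-15T12:00:30Z",
   "sensor": "pm2_5", "value": 11.4},
  {"node_id": "004e", "timestamp": "2018-10-15T12:01:00Z",
   "sensor": "pm2_5", "value": 12.1}
]'''

def mean_reading(payload):
    """Average the sensor values in one batch of observations."""
    observations = json.loads(payload)
    return sum(o["value"] for o in observations) / len(observations)

print(round(mean_reading(sample), 2))  # mean of the two PM2.5 readings
```

With data arriving every 30 seconds per node, this kind of aggregation is the first step toward the neighborhood-level air quality and noise measurements the project describes.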
With 80 percent of Chicagoans living within about two kilometers of one of the 100 sensor and camera pods, it was immediately obvious how a citywide IoT network like this one could raise concerns about privacy and about the data being used for questionable activities like unjust policing or surveillance.
Advocacy groups, along with city and University of Chicago officials, came together to define a privacy policy that exceeded the state requirements in Illinois, and took that draft to communities for resident feedback, said Brenna Berman, the executive director of the City Tech Collaborative at U+I Labs in Chicago, who worked on the rollout of the Array of Things.
“We did not get a lot of pushback about those cameras, mainly because this was a community-based project, not a public safety-based project,” she said during the panel discussion. Residents wanted more access to the images in an effort to be more engaged in the policing of their neighborhoods.
“So the pushback wasn’t, ‘Hey, we don’t want surveillance.’ [It was,] ‘We want to have participation in that surveillance so that we can help to serve our advocacy and engagement within our community,’” Berman recalled.
One of the biggest lessons learned was, “listen to the residents that are going to be engaged in the project, because you can’t guess what they are going to think or need unless you actually ask,” said Berman.
The next evolution of the Array of Things project is to install more of the sensor pods for more detailed readings, said Charlie Catlett, a senior computer scientist with the Argonne National Laboratory at the University of Chicago.
“Our goal is to make 100 percent of the people who live in Chicago have one of these within 2 kilometers of where they live,” said Catlett, during one of the MetroLab Summit sessions. “And we think we can push the 1 kilometer up to at least 70 to 80 percent of the population, at which point an air quality measurement or a noise measurement starts to mean something if it’s a kilometer away instead of 6 kilometers.”
The community IoT network project in Chicago — which brought together community and social justice advocacy groups — underscored how to move forward with sophisticated smart city projects dedicated to collecting and analyzing large amounts of data to bring about improvements to urban life, according to Berman.
“When you’re defining the project and the policy around it — whatever that might be — involve those advocacy groups, whether that’s the ACLU [American Civil Liberties Union], or a specific community representation group, so that they are part of the definition of the project,” she said. “You may not be able to implement everything they suggest. You may not be able to address every concern that they have. But essentially, having the advocacy voice in the tent to help you define the program can go a long way in defining a program that will more holistically understand what the perspective of the overall ecosystem is going to be.”

Smart Cities are Getting Smarter, But Challenges Remain

Ubiquitous sensors and applications are driving rapid growth for smart cities, but machine learning is not yet advanced enough to cope with the capacity demands.

Ubiquitous sensors in mobile robots, aerial drones, and autonomous vehicles, plus connections to municipal infrastructure through the Internet of Things, promise more efficient delivery of utilities and reduced traffic, among other things. While the variety of sensors and applications for smart cities has grown rapidly in recent years, a lot of work remains, especially in the areas of machine learning to analyze and interpret the data from these sensors, experts and observers said.

“[There is] still a long way to go to achieve a smart city,” said Mateja Kovacic, a visiting research fellow at the Urban Institute at the University of Sheffield, and a postdoctoral research fellow at the Nissan Institute of Japanese Studies, University of Oxford. Kovacic points to a number of notable examples in the smart-cities-in-progress space, including Barcelona, Spain, and Dubai.

In Barcelona, municipal authorities have installed smart solar trash cans, and free Wi-Fi routed via street lighting, as well as “sensors that monitor air quality and parking spaces,” Kovacic said.

Dubai, which established a blockchain strategy aimed at creating the world’s first blockchain-powered government, also has an autonomous transportation strategy that seeks to make 25% of all transportation in the city autonomous by 2030.

“There are also efforts to make policing, security, governance, healthcare and public services more autonomous through artificial intelligence, which I see as an extension and expansion of the ‘smart’ paradigm,” Kovacic said.

Autonomous robotics in the next age

Joshua Meler, senior director of marketing at Hangar Technology, said he believes the world is entering an age in which autonomous robotics “will transform how companies operate, industries evolve, and economic opportunities are uncovered.”

Hangar develops a platform that combines drone hardware, software, and data analytics to enable autonomous drones to collect and interpret visual data. Meler said that until now, the platform has been employed mainly for business uses – via drones that automate the end-to-end “aerial insight” supply chain. But he said it has also been rolled out for smart city infrastructure applications in controlled environments and at limited capacity.

“The next stage for Hangar is an era where computers augment visual insights, automating observations and alerting humans of areas that require attention,” Meler said. “This includes counting traffic, identifying cracks on bridges, recognizing inventory on construction sites and more — without human intervention required. We’re not there today, but as technology advances and as regulations facilitate autonomous operations, the Hangar platform will be capable of facilitating many of the applications of smart cities.”

Partnerships drive applications in Finland

The city of Tampere, Finland, is in the process of establishing several innovative and digital smart city solutions through cooperation between companies, organizations, municipalities, and citizens. Pirkko Laitinen, communications manager for Smart Tampere, said the aim is to “create better services for the citizens, and serve as a partner, a platform, and a reference for the companies on their way to the international markets.”

She said the strategic economic program approaches this in two ways. From the inside, the program is “taking the city’s own services to the digital age through agile testing.” On the outside, the program helps businesses “create new business models and smart city solutions through ecosystem building and platform creating.”

Pirkko Laitinen, Smart Tampere

The program focuses on seven smart city themes that Laitinen said are strong in Tampere:

  • Mobility
  • Health
  • Industry
  • Governance and citizens
  • Research and education
  • Buildings and infrastructure
  • Connectivity

“One robot-based new business model we have created with companies is the SmartMile delivery service points, which are in shared use among all parcel delivery service providers,” Laitinen said. “[This] means that online store customers can receive all their orders in one place. The robotics inside the machine is done by Konecranes.”

Machine learning needs to increase capacity

With the increasing demands that smart city applications place on machine learning, Kovacic said she believes the technology must advance further before it can cope with those demands.

The main challenge, she said, is that current machine learning “does not yet have the capacity to handle the quantity of data, and is not autonomous enough to analyze data without human intervention.” Another challenge is integrating different physical and virtual technologies necessary to make a smart city genuinely smart.

“The existing challenges can be overcome by further work on machine learning technology and nurturing a mindset with a holistic, integrative approach,” Kovacic said. “But there is no leapfrogging here, it simply takes time.”

Automobiles with sensors will provide data to smart cities. Source: Smart Tampere

“Another step toward overcoming existing challenges is being more aware that the physical technology, like robots, is an integral aspect of machine learning and vice versa,” she added. “There is no place for mind-body dualisms – or virtual-physical – there needs to be an attempt at integration. Lastly, cybersecurity and data privacy and protection are among the main issues and will need to be dealt with utmost care and consideration for individual and social rights and needs.”

Meanwhile, Laitinen pointed out that machines can currently learn simple tasks and that the technology is developing as the algorithms get better.

“As a city, we are still learning about what would be the best way to gather data from multiple different areas into one pool,” Laitinen said, “and how to analyze it in order to offer it for the companies to use.”

The skies will get smarter before the ground

Hangar’s Meler said he believes key sensor innovation and smart city trends “will happen in the sky before they happen on the ground.” The path to an autonomous world “must first rise up, in a largely uninhabited space free of children chasing soccer balls across the street, running groups beating the crosswalk light, or distracted drivers listening to the radio and texting a friend,” Meler said.

“The fact is, completely autonomous drones are years away, while cars and robotics will take at least a decade before they prove safe at scale,” he added. “For this reason, I think we’ll see meaningful innovation [in the air] first. Drone hardware will enable heavier payloads and longer flight times. Sensors will get smaller, better and cheaper. Governments and industries will lift regulations and restrictions. And this Solow’s Paradox we’re experiencing with digitization will hit a tipping point, and we’ll enter a new age of productivity.”

Kovacic said she envisions the “full integration of vehicles, drones, and robot-mounted sensors with the city through IoT,” particularly since “the quantity of data a city can produce exceeds human capacity, and needs a sophisticated network of everything.”

“Swarm technology is very promising and can be applied in vehicles, drones and different robots to produce collective action and decision-making,” Kovacic said. “Another key innovation may be decentralization. Unlike the old smart-city paradigm, where stationary sensors and cameras collect data and send it to a centralized system for analysis, the analysis and decision-making will become dispersed, decentralized and more efficient and instantaneous.”

“[A] smart city will no longer be a static accumulator of data but will become extended through mobile technology with capability to interact with each other and instantaneously make decisions based on this interaction without human intervention,” she added.

Even so, Kovacic said she suspects such developments will take more than a few years, and envisions a proliferation of various robots, such as drones for e-commerce, shared autonomous vehicles, service and retail robots, and city maintenance swarm robots. She also said she expects to see more machine learning-enhanced services across a smart city, from governance to the service industry.

“In California, there is currently underway a pilot project where autonomous vehicles pick up passengers, and delivery robots deliver groceries and food,” Kovacic said. “These are just two examples of what we can expect from future smart city applications – but only when these technologies are also connected and interact with the city — which they are currently not — and when there is a feedback loop between them and the city – a truly cybernetic city.”

Human Brain-Sized Artificial Intelligence (AI): Coming Soon To A Cloud Data Center Near You

Data center-hosted artificial intelligence is rapidly proliferating in both government and commercial markets, and while it’s an exciting time for AI, only a narrow set of applications is being addressed, primarily neural networks based on a convolutional approach. Other categories of AI include general AI, symbolic AI and bio-AI; all three have different processing demands and run distinctly different algorithms. Virtually all of today’s commercial AI systems run neural network applications, but much more control-intensive and powerful AI workloads using symbolic AI, bio-AI and general AI algorithms are ill-suited to GPU/TPU architectures.

Today, commercial and governmental entities that need AI solutions are using workarounds to achieve more compute power for their neural net applications, and chief among them are specialty processors like Google TPUs and NVIDIA GPUs, provisioned in data centers specifically for AI workloads.

However, using TPUs and GPUs, even if they are dedicated to AI processing tasks, can still be problematic. It drives up data center capital expenditures for AI-specific processors, and it drives up costs for software development (e.g., GPUs are notoriously difficult to program). In most hyperscale data centers today, there exists a combination of standard CPUs for normal data center workloads and specialty TPUs or GPUs (comprising approximately 5-10% of server rack space) dedicated to AI/neural net processing.

CPUs are easy to program but become slow and power-hungry when tasked with highly parallel AI applications. Specialty AI processors are faster and more power efficient than CPUs for neural net applications, but they are difficult to program.

Today, if embarrassingly parallel computation is the goal (i.e., executing each instruction mindlessly on a large number of data sets), such as in convolutional neural networks, TPUs/GPUs are a go-to solution. They are more efficient (and in the case of TPUs, they can be up to 30x faster) than CPUs for convolutional neural net processing. This is because the action of fetching and scheduling an instruction uses significantly more power than actually executing that instruction on a single data set. A specialty AI processor, such as a GPU, will fetch a single instruction and execute that instruction on 32 datasets simultaneously (maximizing throughput and minimizing power).
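That fetch-once, execute-on-many pattern is the essence of data parallelism. A rough sketch in NumPy, standing in for what the GPU does in hardware:

```python
import numpy as np

# 32 "lanes" of data, mirroring the 32 datasets a single GPU
# instruction touches at once in the description above.
weights = np.arange(32, dtype=np.float32)

# Scalar style: the "multiply by 2" instruction is conceptually
# fetched and dispatched once per element.
scalar = [float(w) * 2.0 for w in weights]

# Data-parallel style: the operation is expressed once for all lanes.
vector = weights * 2.0

assert scalar == list(vector)
print(float(vector.sum()))  # 2 * (0 + 1 + ... + 31) = 992.0
```

The results are identical; the difference is that in the second form the instruction-fetch overhead is amortized across the whole batch, which is exactly why GPUs/TPUs win on convolutional workloads and lose on control-heavy ones.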

Google recently announced its third-generation TPU, which is still nowhere near the performance needed for real-time human brain simulation projects. And general AI, bio-AI and symbolic AI algorithms are not a good match for GPU/TPU processors.

The human brain needs to process huge amounts of information in order to take action in real time, and this requires massive processing power. Today’s supercomputers don’t even come close to the processing power of the human brain (which is approximately 10^19 floating point operations per second). One of the fastest supercomputers on the planet today, China’s Sunway TaihuLight, with 10,649,600 cores, can achieve 93 petaflops (Rmax on the Linpack benchmark suite). That’s a tiny fraction of what we need to simulate the human brain in real time, which requires approximately 10^19 flops (that’s 10 exaflops, or 10,000 petaflops).
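The size of that gap can be checked with quick arithmetic, using the figures above:

```python
# Back-of-envelope comparison using the figures quoted above.
taihulight_flops = 93e15   # 93 petaflops (Rmax on Linpack)
brain_flops      = 1e19    # ~10 exaflops for real-time brain simulation

shortfall = brain_flops / taihulight_flops
print(round(shortfall))    # today's fastest machine is ~108x short
```

In other words, real-time brain simulation needs roughly a hundred TaihuLights' worth of sustained compute.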

We have a long way to go, but we are getting there. In fact, I predict it will be about two years, give or take.

If you’re not yet familiar with ongoing efforts to build a super supercomputer, one capable of simulating a human brain, consider the Human Brain Project, which was established by the European Union in 2013 to unite the fields of neuroscience, medicine and computing for both commercial and research needs.

SpiNNaker (spiking neural network architecture), part of the Human Brain Project, is being led by professor Steve Furber (one of the original designers of the ARM processor and a current member of Tachyum’s Board of Advisors) at the University of Manchester. SpiNNaker’s goal is to simulate the equivalent of a rat brain (about 1/1,000th the scale of a human brain) in real time, using around 1 million ARM processors configured as a spiking neural network, which simulates neuronal activity more accurately and uses much less power than “embarrassingly parallel” neural nets. If your brain were a neural network, it would boil inside your skull.

Along with the examples described above, my company, Tachyum, is working on a breakthrough processor architecture called Prodigy. Prodigy architecture offloads heavy lifting tasks normally done in hardware to a Tachyum-proprietary smart compiler.

It’s only taken you about four minutes to read this article. During that time, people searched the web almost 14 million times, logged into Facebook 3.8 million times, tweeted 1.8 million times, watched more than 17 million YouTube videos, and swiped right or left on 4.4 million Tinder profiles.

When cloud-based data centers offer users AI applications at a reasonable cost, tasks like manually looking at Tinder profiles and then swiping will seem downright archaic. The new data and AI centers will know which profiles to flag for you, and they will know which YouTube videos you will want to watch. Sooner than you think, data centers will be the place to access low-cost AI solutions for everyone.

New NVIDIA Data Center Inference Platform to Fuel Next Wave of AI-Powered Services

Tesla T4 GPU and New TensorRT Software Enable Intelligent Voice, Video, Image and Recommendation Services

Fueling the growth of AI services worldwide, NVIDIA today launched an AI data center platform that delivers the industry’s most advanced inference acceleration for voice, video, image and recommendation services.

The NVIDIA TensorRT™ Hyperscale Inference Platform features NVIDIA® Tesla® T4 GPUs based on the company’s breakthrough NVIDIA Turing™ architecture and a comprehensive set of new inference software.

Delivering the fastest performance with lower latency for end-to-end applications, the platform enables hyperscale data centers to offer new services, such as enhanced natural language interactions and direct answers to search queries rather than a list of possible results.

“Our customers are racing toward a future where every product and service will be touched and improved by AI,” said Ian Buck, vice president and general manager of Accelerated Business at NVIDIA. “The NVIDIA TensorRT Hyperscale Platform has been built to bring this to reality — faster and more efficiently than had been previously thought possible.”

Every day, massive data centers process billions of voice queries, translations, images, videos, recommendations and social media interactions. Each of these applications requires a different type of neural network residing on the server where the processing takes place.

To optimize the data center for maximum throughput and server utilization, the NVIDIA TensorRT Hyperscale Platform includes both real-time inference software and Tesla T4 GPUs, which process queries up to 40x faster than CPUs alone.

NVIDIA estimates that the AI inference industry is poised to grow in the next five years into a $20 billion market.

The NVIDIA TensorRT Hyperscale Platform includes a comprehensive set of hardware and software offerings optimized for powerful, highly efficient inference. Key elements include:

NVIDIA Tesla T4 GPU – Featuring 320 Turing Tensor Cores and 2,560 CUDA® cores, this new GPU provides breakthrough performance with flexible, multi-precision capabilities, from FP32 to FP16 to INT8, as well as INT4. Packaged in an energy-efficient, 75-watt, small PCIe form factor that easily fits into most servers, it offers 65 teraflops of peak performance for FP16, 130 TOPS for INT8 and 260 TOPS for INT4.

NVIDIA TensorRT 5 – An inference optimizer and runtime engine, NVIDIA TensorRT 5 supports Turing Tensor Cores and expands the set of neural network optimizations for multi-precision workloads.

NVIDIA TensorRT inference server – This containerized microservice software enables applications to use AI models in data center production. Freely available from the NVIDIA GPU Cloud container registry, it maximizes data center throughput and GPU utilization, supports all popular AI models and frameworks, and integrates with Kubernetes and Docker.
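The multi-precision capabilities listed above rest on quantization: trading a little numerical fidelity for much higher throughput. A toy sketch of symmetric INT8 quantization follows; TensorRT's actual calibration is considerably more sophisticated, and this only illustrates the idea.

```python
# Toy symmetric linear quantization of FP32 values to INT8. Real
# inference engines calibrate scales per layer or per channel; this
# sketch uses one global scale purely for illustration.

def quantize_int8(values):
    """Map floats onto integers in [-127, 127] with a single scale."""
    scale = max(abs(v) for v in values) / 127.0
    return [round(v / scale) for v in values], scale

def dequantize(quantized, scale):
    return [q * scale for q in quantized]

activations = [0.02, -1.27, 0.64, 1.27]
quantized, scale = quantize_int8(activations)
restored = dequantize(quantized, scale)

print(quantized)  # [2, -127, 64, 127]
# Rounding error is bounded by half a quantization step:
print(all(abs(a - r) <= scale / 2 for a, r in zip(activations, restored)))
```

Each INT8 value occupies a quarter of the space of an FP32 value and feeds narrower, faster arithmetic units, which is where the 130 TOPS INT8 figure comes from.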

Supported by Technology Leaders Worldwide

Support for NVIDIA’s new inference platform comes from leading consumer and business technology companies around the world.

“We are working hard at Microsoft to deliver the most innovative AI-powered services to our customers,” said Jordi Ribas, corporate vice president for Bing and AI Products at Microsoft. “Using NVIDIA GPUs in real-time inference workloads has improved Bing’s advanced search offerings, enabling us to reduce object detection latency for images. We look forward to working with NVIDIA’s next-generation inference hardware and software to expand the way people benefit from AI products and services.”

Chris Kleban, product manager at Google Cloud, said: “AI is becoming increasingly pervasive, and inference is a critical capability customers need to successfully deploy their AI models, so we’re excited to support NVIDIA’s Turing Tesla T4 GPUs on Google Cloud Platform soon.”

Top Cryptocurrency and Blockchain Startups in Healthcare

The impact of blockchain is no longer a matter of imagination and science fiction. Tech behemoths such as IBM have spent significant resources towards conducting comprehensive studies to understand how vital blockchain can be for healthcare, and dozens of high-octane blockchain startups have set out to revolutionize the healthcare space for the better.

The intersection of blockchain, tokenization, and smart contracts with the healthcare industry paves the way for new and highly innovative solutions to solve problems that have been festering for decades.

The startups in this article are utilizing the above technologies to fight dangerous and rampant global pharmaceutical counterfeiting, streamline the transmission and storage of medical records, and even create long-term incentivization-based loyalty programs for dental communities.

1. Fighting Fake Pharma with MediLedger

The pharmaceutical industry has been slowly bleeding out to the tune of $75 billion every year due to counterfeit drugs. With over 100,000 deaths worldwide directly linked to the use of these counterfeit drugs, fake pharma not only poses a global threat to corporate innovation but also to thousands of people unknowingly taking drugs that may end up killing them.

The fake pharma problem shows no signs of slowing down; experts estimate that a $1,000 investment in a counterfeit prescription drug operation could return $30,000 – nearly ten times the profit of trafficking heroin. Sweetening the deal for pharma bootleggers, the criminal penalties for selling counterfeit medications are far lighter than those for dealing harder narcotics, making the profit margins higher and the risks lower.

The rise of internet pharmacies has also made it incredibly difficult for authorities to keep up with the tracking and verification of billions of pills shipped out every year. United States consumers are largely oblivious to the dangers of purchasing drugs online, as an estimated 36 million Americans have been able to buy drugs online without a valid prescription. Stricter regulatory supervision would require a technological asset to help track supply chains at scale.

That’s where MediLedger comes in. MediLedger is an open and decentralized network for the pharmaceutical supply chain. Blockchain has long been praised for its supply chain applications, and it has already been used broadly in large shipping networks that share logistical concerns similar to those of the pharmaceutical industry.

Using a project such as MediLedger allows pharmacists and patients to verify the legitimacy of their drugs, and even see minute details such as the manufacturing date and origin of each order. MediLedger combines compliance, track-and-trace, and security protocols to create a cost-effective, GS1-compliant verification system for pharmaceutical companies. Regulators are also able to use the MediLedger platform to obtain crucial information whenever needed.
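The core mechanism behind such track-and-trace systems is a chain of custody in which each handoff commits to a hash of the record before it. A minimal illustration of the general pattern, not MediLedger's actual protocol:

```python
import hashlib

# Toy chain of custody: each supply chain handoff stores the hash of
# the previous record, so any tampering breaks every later link.
# Illustrative only -- not MediLedger's actual protocol.

def record_hash(record):
    return hashlib.sha256(repr(sorted(record.items())).encode()).hexdigest()

def handoff(chain, holder):
    prev = record_hash(chain[-1]) if chain else "genesis"
    chain.append({"holder": holder, "prev": prev})
    return chain

def verify(chain):
    """True only if every link commits to the record before it."""
    prev = "genesis"
    for record in chain:
        if record["prev"] != prev:
            return False
        prev = record_hash(record)
    return True

chain = []
for holder in ["manufacturer", "wholesaler", "pharmacy"]:
    handoff(chain, holder)

print(verify(chain))            # True: custody chain intact
chain[1]["holder"] = "unknown"  # tamper with the middle record
print(verify(chain))            # False: later hash no longer matches
```

A pharmacist running the equivalent of `verify` can confirm an unbroken path from manufacturer to shelf, which is what makes counterfeit insertion detectable.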

2. Decreasing the Patient’s Need for Patience with MedicalChain

Medical records create several cumbersome responsibilities for doctors, hospitals, pharmacists, and patients, particularly for maintenance and storage.

Medicalchain utilizes blockchain to securely store and maintain health records in a single location, allowing different organizations such as doctors’ offices, hospitals, laboratories, and pharmacies to request permission to access each patient’s record.

Each patient would essentially be assigned a unique blockchain fingerprint that verifies their identity and provides them with full access to and control over their data. Patients will be able to grant different levels of access to various users by setting access permissions and designating who is able to write data to their blockchain.
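The permission model described above can be sketched as a small access-control wrapper around a record. The class and method names below are illustrative, not Medicalchain's actual API:

```python
# Sketch of patient-controlled access levels: the patient grants read
# or write permission per organization. Purely illustrative.

class PatientRecord:
    def __init__(self, patient_id):
        self.patient_id = patient_id
        self.permissions = {}   # organization -> "read" or "write"
        self.entries = []       # append-only record entries

    def grant(self, org, level):
        if level not in ("read", "write"):
            raise ValueError("level must be 'read' or 'write'")
        self.permissions[org] = level

    def read(self, org):
        if org not in self.permissions:
            raise PermissionError(f"{org} has no access")
        return list(self.entries)

    def write(self, org, entry):
        if self.permissions.get(org) != "write":
            raise PermissionError(f"{org} cannot write")
        self.entries.append((org, entry))

record = PatientRecord("patient-001")
record.grant("lab", "write")
record.grant("pharmacy", "read")

record.write("lab", "HbA1c: 5.4%")
print(record.read("pharmacy"))  # pharmacy can read but not write
```

On a real platform the permission table and entries would live on chain, so grants and revocations would themselves be auditable.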

This feature is particularly notable in the context of the emerging trend of telemedicine, or the online consultation between doctors and patients using a webcam. Medicalchain would make it possible for patients to grant their digital doctor access to their records, ultimately leading to more in-depth consultations.

Maintaining medical records on the Medicalchain blockchain would not only offer higher degrees of flexibility in treatment, but is also much less expensive than current methods.

3. Incentivizing Patient Responsibility with Dentacoin

Although it might not be your typical healthcare startup, Dentacoin showcases a unique consumer-facing application of blockchain and smart contracts.

For many people, going to the dentist is an experience forced by circumstance rather than preventative care, and procedures such as root canals can be very expensive and painful. $440 billion is spent on dental treatment globally every year, and according to the American Dental Association, 90% of these expenses can be avoided if patients visit a dentist at least three times a year for early preventative treatments and checkups, while also establishing healthier dental habits like flossing and better nutrition.

Dentacoin removes the need for big insurance companies acting as intermediaries via smart contracts that link patients directly with dentists.

Essentially, each smart contract encourages patients to take control of their preemptive dental care such as flossing, proper nutrition, and routine check-ups with dentists. Additionally, patients can earn rewards by providing accurate reviews for each visit to help increase the overall quality of the dental community. Since these reviews are stored on the blockchain, they’re immutable and can’t be manipulated.

Dentists are incentivized to ensure long-term dental success for each patient using the Dentacoin network and are rewarded in small monthly contributions of the Dentacoin token, which can later be sold for fiat currency on a variety of exchanges.

Since its launch on August 17, 2017, more than 4,093 dentists have started using the Dentacoin platform.

Final Thoughts

Blockchain enables a fundamental increase in the technological capabilities of multiple sectors that focus on the well-being and longevity of the human population. The startups listed above are not only showcasing how this new technology can impact an industry as critical as healthcare; they’re also paving the way for countless future startups and applications.

Top Blockchain Applications Making Waves in Commercial Real Estate

The commercial real estate (CRE) industry comprises many different types of service providers, including property management firms, brokerages, banks, and other lenders. When a CRE transaction takes place, various operators are involved, requiring extensive sharing of official property documents and financial information that must be validated. The requirement to validate all information across all parties slows down each transaction, which can take weeks or even months to complete. Many CRE firms have turned to blockchain to speed up execution times, reduce errors, and increase transparency in each transaction.

What Is Blockchain?

Blockchain technology is a way to store and transfer information in an encrypted manner by distributing data instead of copying it to a central location. Blockchain does so through a peer-to-peer network that eliminates the need for a third party, which ultimately reduces transaction fees. A digital ledger is created and updated with each financial transaction, recorded in blocks.

There are plenty of benefits to making transactions and transferring data using blockchain, as the technology is not controlled by one central entity, such as a central bank. Breaching these blocks is extremely difficult, which maintains the integrity and transparency of transactions and data.

Blockchain is the backbone of cryptocurrencies such as Bitcoin, which offer speedy and low-cost ways of sending and receiving money.

Faster Transactions

One of the most exciting ways blockchain is disrupting the CRE world is in the form of smart contracts. The industry currently relies on an inefficient, old-school system of verifying property ownership by researching whether the property belongs to the party selling it.

Blockchain can reduce the time it takes to establish the chain of custody for CRE properties, as a property’s title would be stored on a public ledger. This would remove the need for a central repository, thus reducing transaction, state, city, and legal costs. The same principle would apply to leases recorded via blockchain.
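The kind of logic a smart contract would encode for such a transfer can be sketched in a few lines. This is a deliberately simplified, hypothetical escrow, not any real platform's contract code:

```python
# Toy escrow logic of the kind a smart contract could encode for a
# property sale: title moves only once payment meets the agreed price,
# and every transfer is appended to an auditable ledger.

class TitleContract:
    def __init__(self, owner, price):
        self.owner = owner
        self.price = price
        self.ledger = []   # append-only transfer history

    def settle(self, buyer, payment):
        if payment < self.price:
            raise ValueError("payment below agreed price")
        self.ledger.append((self.owner, buyer, payment))
        self.owner = buyer

contract = TitleContract(owner="seller-llc", price=275_000)
contract.settle("buyer-001", 275_000)

print(contract.owner)   # buyer-001
print(contract.ledger)  # [('seller-llc', 'buyer-001', 275000)]
```

Because the transfer conditions execute automatically, the weeks of manual verification described above collapse into a single recorded state change.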

More Transparent Deals

Blockchain can also ensure that real estate assets are more liquid and that the terms of an agreement are fully understood by both sides, as every piece of data regarding a property would be stored publicly. This includes data about former owners, construction done on the property, past maintenance costs, and records of former inspections.

Having all this information available would give the investor a more comprehensive idea of the property they are investing in. Blockchain essentially ensures that everyone is on the same page and both sides are fully aware of what they’re getting into as every piece of information is out there for anyone to access.

Digital Paper Trail

Another challenge with the CRE industry is the fact that public records can be outdated, unreliable or not available. Following a property’s paper trail can be time-consuming and frustrating as a lot of this information is lost due to poor organizational skills from industry workers and legacy systems that lose data when updated.

With blockchain, every piece of information on a property would be available in one place rather than across multiple physical and digital domains. Blockchain would also help eliminate the type of fraud that sometimes exists in the industry, as deeds and titles can be counterfeited easily.

Buying Property With Cryptocurrencies

As previously mentioned, Bitcoin is a cryptocurrency that relies on blockchain to complete financial transactions online. Some investors and real estate firms have started bringing Bitcoin into the industry, including Ivan Pacheco, who bought a two-bedroom condominium in Florida for $275,000 in Bitcoin.

In the residential space, you can buy a condo on the Lower East Side of Manhattan with Bitcoin. Meanwhile, some apartments in New York City are allowing their tenants to pay for rent using Bitcoin. Cryptocurrencies have been historically volatile and they’ve been on the decline since peaking in December 2017, but some investors believe that the future of real estate will be closely tied with Bitcoin and other digital coins.

Nevertheless, blockchain’s role in the CRE industry is becoming more prevalent each day. The technology’s potential to speed up transactions with smart contracts, its ability to add transparency to a deed or title, and the fact that it dramatically decreases the chance of fraud suggest that more investors will flock toward firms that use blockchain for CRE transactions.
