
KPMG Report Lays Out Incredibly Bullish Case for Crypto Assets


Is blockchain a solution to IoT security problems?

The Internet of Things (IoT) has captured business and popular imagination over the last few years.

These smart, sensor-laden devices can autonomously exchange data across the internet without human intervention, creating business opportunities and opening new markets.

However, IoT has significant problems.

It’s ripe for manipulation by criminals, who can harness insecure IoT devices to mount massive DDoS attacks, or simply tap the data streaming through an IoT network for illicit gain. Put simply, IoT presents a security challenge, and without standards the situation is going to get worse.

In fact, Gartner predicts that by 2020, addressing compromises in IoT security will have increased security costs to 20% of annual security budgets, up from less than one percent in 2015.

Despite the costs, there’s no doubt that business is doubling down on IoT.

Analysts report that there will be more than 55 billion IoT devices by 2025, up from about 9 billion in 2017.

The same report finds that there will be nearly $US15 trillion in aggregate IoT investment between 2017 and 2025, and it also reports that companies’ plans to invest in IoT are accelerating.

Another technology that has recently captured our collective imagination is blockchain, which was built to underpin and authenticate cryptocurrency transactions.

Australia’s CSIRO summarises blockchain as a ‘cryptographically secured, immutable distributed ledger technology.’

Put another way, blockchain is a system of tracking almost anything, from transactions to digital identity to the provenance of goods, in a way that can’t be faked or forged.

Could using blockchain to trace and authenticate IoT data, regardless of what that data is, be the answer to the problems plaguing the Internet of Things?

Blockchain broke into the business world with the rise of Bitcoin, but its uses have grown exponentially since then.

Despite this, Gartner’s hype cycle finds that blockchain has moved into the third stage of its lifecycle – the trough of disillusionment.

From here, only the most promising of blockchain technology applications will survive.

Using blockchain to determine provenance

One company that is exploring the use of blockchain with IoT is Sydney-based and China-backed startup Ultimo Digital Technologies (UDT).

The company is headed up by John Baird, a former CSIRO experimental scientist and the chair of the cybersecurity advisory council advising the New South Wales government.

UDT is experimenting with blockchain and IoT to track the integrity of goods such as baby formula and wine, both of which are easily forged and then sold to unsuspecting consumers.

China, in particular, has had massive difficulties with fake baby formula, with one instance of counterfeit formula resulting in deaths and prolonged illness among the babies who were fed it. Wine forgery also remains a problem in growth markets like China.

The UDT trials involve microchip-embedded labels, the details of which are stored on the blockchain.

This enables the goods to be tracked from the moment they are produced, through the supply chain and to the point of sale and the final consumer purchase.

By storing the data in a blockchain, the details can’t be forged, ensuring the integrity of the end product.
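The tamper-evidence described above can be illustrated with a toy sketch (purely illustrative, and not UDT’s actual platform): each supply-chain event is hashed together with the hash of the previous entry, so altering any earlier record breaks the verification of everything that follows.

```python
import hashlib
import json

# Illustrative only: a minimal hash chain, not UDT's actual system.
def digest(event, prev_hash):
    """SHA-256 over the event plus the previous entry's hash."""
    payload = json.dumps({"event": event, "prev": prev_hash}, sort_keys=True)
    return hashlib.sha256(payload.encode()).hexdigest()

class ProvenanceLedger:
    """Append-only chain: every entry commits to the one before it."""
    GENESIS = "0" * 64

    def __init__(self):
        self.chain = []

    def record(self, event):
        prev = self.chain[-1]["hash"] if self.chain else self.GENESIS
        self.chain.append({"event": event, "hash": digest(event, prev)})

    def verify(self):
        """Re-derive every hash; a tampered entry breaks the chain."""
        prev = self.GENESIS
        for entry in self.chain:
            if digest(entry["event"], prev) != entry["hash"]:
                return False
            prev = entry["hash"]
        return True

ledger = ProvenanceLedger()
ledger.record({"sku": "formula-001", "stage": "produced", "site": "factory-A"})
ledger.record({"sku": "formula-001", "stage": "shipped", "carrier": "truck-7"})
print(ledger.verify())        # True: the chain is intact

ledger.chain[0]["event"]["site"] = "factory-B"   # forge an earlier record
print(ledger.verify())        # False: the forged record breaks the chain
```

A real deployment distributes this ledger across many nodes, which is what makes forging a record impractical rather than merely detectable.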

So where can blockchain and the Internet of Things work together?

It’s worth considering the threats faced by IoT: unauthorised physical access to devices; software attacks such as viruses and worms; denial-of-service attacks that take down IoT networks; brute-force attacks, where passwords are guessed by trying many combinations; and man-in-the-middle attacks, where traffic between devices is intercepted.

Blockchain could mitigate these threats by providing a framework for more automated security and attack prevention.

These advances include creating a distributed system of record for sharing data across a network of key stakeholders, and embedding business terms to automate interactions between nodes in the system.

It also could enable consensus and agreement models for detecting bad actors and mitigating threats.

Using these techniques, a blockchain-enabled IoT deployment could improve security by allowing devices to register and verify themselves against the network.

More importantly, because blockchain has no central system to attack, threats like denial-of-service attacks would be deterred by the very nature of the system.

Blockchain would enable real-world business benefits, such as allowing data tracking and the creation of an immutable history of why certain decisions were made by an IoT device.

It would also permit secure software updates, as well as payments and micropayments for the completion of an IoT service or product delivery.

One last hurdle

Despite these business benefits, blockchain and IoT are still unlikely bedfellows.

That’s because the current performance and scalability of blockchain are incompatible with the demands of IoT.

Basically, what’s needed is a new type of blockchain that can support those predicted 55 billion devices.

That hasn’t happened yet, but chances are that it will, and soon.

Blockchain represents the best potential answer for solving the problems that ail IoT.

It won’t solve every problem, but there’s a good chance that blockchain will improve IoT and make it fit for purpose – that’s because, like it or not, we’re in the age of IoT now, and it’s only going to grow from here.

Machine Learning to Help Optimize Traffic and Reduce Pollution

Applying artificial intelligence to self-driving cars to smooth traffic, reduce fuel consumption, and improve air quality predictions may sound like the stuff of science fiction, but researchers at the Department of Energy’s Lawrence Berkeley National Laboratory (Berkeley Lab) have launched two research projects to do just that.

In collaboration with UC Berkeley, Berkeley Lab scientists are using deep reinforcement learning, a computational tool for training controllers, to make transportation more sustainable. One project uses deep reinforcement learning to train autonomous vehicles to drive in ways to simultaneously improve traffic flow and reduce energy consumption. A second uses deep learning algorithms to analyze satellite images combined with traffic information from cell phones and data already being collected by environmental sensors to improve air quality predictions.

“Thirty percent of energy use in the U.S. is to transport people and goods, and this energy consumption contributes to air pollution, including approximately half of all nitrogen oxide emissions, a precursor to particulate matter and ozone – and black carbon (soot) emissions,” said Tom Kirchstetter, director of Berkeley Lab’s Energy Analysis and Environmental Impacts Division, an adjunct professor at UC Berkeley, and a member of the research team.

“Applying machine learning technologies to transportation and the environment is a new frontier that could pay significant dividends – for energy as well as for human health.”

Traffic smoothing with Flow

The traffic-smoothing project, dubbed CIRCLES, or Congestion Impact Reduction via CAV-in-the-loop Lagrangian Energy Smoothing, is led by Berkeley Lab researcher Alexandre Bayen, who is also a professor of electrical engineering and computer science at UC Berkeley and director of UC Berkeley’s Institute of Transportation Studies. CIRCLES is based on a software framework called Flow, developed by Bayen’s team of students and post-doctoral researchers.

Flow is a first-of-its-kind software framework allowing researchers to discover and benchmark schemes for optimizing traffic. Using a state-of-the-art open-source microsimulator, Flow can simulate hundreds of thousands of vehicles – some driven by humans, others autonomous – driving in custom traffic scenarios.

“The potential for cities is enormous,” said Bayen. “Experiments have shown that the energy savings with just a small percentage of vehicles on the road being autonomous can be huge. And we can improve it even further with our algorithms.”

Flow was launched in 2017 and released to the public in September, and the benchmarks are being released this month. With funding from the Laboratory Directed Research and Development program, Bayen and his team will use Flow to design, test, and deploy the first connected and autonomous vehicle (CAV)-enabled system to actively reduce stop-and-go phantom traffic jams on freeways.

How reinforcement learning can reduce congestion

Some of the current research into using autonomous vehicles to smooth traffic was inspired by a simple experiment done by Japanese researchers 10 years ago, in which about 20 human drivers were instructed to drive in a ring at 20 mph. At first everyone proceeds smoothly, but within 30 seconds traffic waves start and cars come to a standstill.

“You have stop-and-go oscillation within less than a minute,” Bayen said. “This experiment led to hundreds if not thousands of research papers to try to explain what is happening.”

A team of researchers led by Dan Work of Vanderbilt University repeated the same experiment last year but made one change: they added a single autonomous vehicle in the ring. As soon as the automation is turned on, the oscillations are immediately smoothed out.

Why? “The automation essentially understands to not accelerate and catch up with the previous person – which would amplify the instability – but rather to behave as a flow pacifier, essentially smoothing down by restraining traffic so that it doesn’t amplify the instability,” Bayen said.

Deep reinforcement learning has been used to train computers to play chess and to teach a robot how to run an obstacle course. It trains by “taking observations of the system, and then iteratively trying out a bunch of actions, seeing if they’re good or bad, and then picking out which actions it should prioritize,” said Eugene Vinitsky, a graduate student working with Bayen and one of Flow’s developers.

In the case of traffic, Flow trains vehicles to check what the cars directly in front of and behind them are doing. “It tries out different things – it can accelerate, decelerate, or change lanes, for example,” Vinitsky explained. “You give it a reward signal, like, was traffic stopped or flowing smoothly, and it tries to correlate what it was doing to the state of the traffic.”
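The loop Vinitsky describes can be sketched with a toy example: tabular Q-learning rather than the deep reinforcement learning used in Flow, applied to an invented gap-keeping task. The agent tries actions, receives a reward signal, and gradually learns which action to prefer in each state.

```python
import random

# Toy stand-in for the training loop (tabular Q-learning, not Flow's
# deep RL): a follower car learns to hold a safe gap to the car ahead.
ACTIONS = [-1, 0, 1]            # decelerate, hold, accelerate
GAPS = range(10)                # discretized gap to the lead car

def reward(gap):
    """Reward signal: best when holding a 5-unit gap, worse either side."""
    return -abs(gap - 5)

Q = {(g, a): 0.0 for g in GAPS for a in ACTIONS}
alpha, gamma, eps = 0.1, 0.9, 0.1

random.seed(0)
for episode in range(2000):
    gap = random.choice(list(GAPS))
    for _ in range(20):
        # epsilon-greedy: mostly exploit, occasionally try something new
        if random.random() < eps:
            a = random.choice(ACTIONS)
        else:
            a = max(ACTIONS, key=lambda x: Q[(gap, x)])
        # accelerating closes the gap; decelerating opens it
        new_gap = min(max(gap - a, 0), 9)
        r = reward(new_gap)
        # standard Q-learning update toward the observed reward
        best_next = max(Q[(new_gap, x)] for x in ACTIONS)
        Q[(gap, a)] += alpha * (r + gamma * best_next - Q[(gap, a)])
        gap = new_gap

# The learned policy opens a small gap and closes a large one.
policy = {g: max(ACTIONS, key=lambda x: Q[(g, x)]) for g in GAPS}
print(policy)
```

The essential shape is the same as in Flow: observe the state, act, receive a reward tied to how smoothly traffic flows, and correlate actions with outcomes.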

With the CIRCLES project, Bayen and his team plan to first run simulations to confirm that significant energy savings result from using the algorithms in autonomous vehicles. Next they will run a field test of the algorithm with human drivers responding to real-time commands.


Predicting air quality with DeepAir

The pollution project, named DeepAir (Deep Learning and Satellite Imagery to Estimate Air Quality Impact at Scale), is led by Berkeley Lab researcher Marta Gonzalez, who is also a professor in UC Berkeley’s City & Regional Planning Department. In past research, she has used cell phone data to study how people move around cities and to recommend electric vehicle charging schemes to save energy and costs.

For this project, she will take advantage of the power of deep learning algorithms to analyze satellite images combined with traffic information from cell phones and data already being collected by environmental monitoring stations.

“The novelty here is that while the environmental models, which show the interaction of pollutants with weather – such as wind speed, pressure, precipitation, and temperature – have been developed for years, there’s a missing piece,” Gonzalez said. “In order to be reliable, those models need to have good inventories of what’s entering the environment, such as emissions from vehicles and power plants.

“We bring novel data sources such as mobile phones, integrated with satellite images. In order to process and interpret all this information, we use machine learning models applied to computer vision. The integration of information technologies to better understand complex natural system interactions at large scale is the innovative piece of DeepAir.”

The researchers anticipate that the resulting analysis will allow them to gain insights into the sources and distribution of pollutants, and ultimately allow for the design of more efficient and more timely interventions. For example, the Bay Area has “Spare the Air” days, in which traffic restrictions are voluntary, and other cities have schemes to restrict traffic or industry.

While the idea of using algorithms to control cars and traffic may sound incredible at the moment, Bayen believes technology is headed in that direction. “I do believe that within 10 years the things we’re coming up with here, like flow smoothing, will be standard practice, because there will be more automated vehicles on the road,” he said.

4 human-caused biases we need to fix for machine learning

Bias is an overloaded word. It has multiple meanings, from mathematics to sewing to machine learning, and as a result it’s easily misinterpreted.

When people say an AI model is biased, they usually mean that the model is performing badly. But ironically, poor model performance is often caused by various kinds of actual bias in the data or algorithm.

Machine learning algorithms do precisely what they are taught to do and are only as good as their mathematical construction and the data they are trained on. Algorithms that are biased will end up doing things that reflect that bias.

To the extent that we humans build algorithms and train them, human-sourced bias will inevitably creep into AI models. Fortunately, bias, in every sense of the word as it relates to machine learning, is well understood. It can be detected and it can be mitigated — but we need to be on our toes.

There are four distinct types of machine learning bias that we need to be aware of and guard against.

1. Sample bias

Sample bias is a problem with training data. It occurs when the data used to train your model does not accurately represent the environment that the model will operate in. There is virtually no situation where an algorithm can be trained on the entire universe of data it could interact with.

But there’s a science to choosing a subset of that universe that is both large enough and representative enough to mitigate sample bias. This science is well understood by social scientists, but not all data scientists are trained in sampling techniques.

We can use an obvious but illustrative example involving autonomous vehicles. If your goal is to train an algorithm to autonomously operate cars during the day and night, but train it only on daytime data, you’ve introduced sample bias into your model. Training the algorithm on both daytime and nighttime data would eliminate this source of sample bias.
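A quick way to make the day/night example concrete is to compare the class mix in the training set against the mix the model will face in deployment (the labels and proportions below are invented for illustration):

```python
from collections import Counter

# Hypothetical capture-time labels for the driving example above.
deployment = ["day"] * 60 + ["night"] * 40    # mix the model will face
biased_train = ["day"] * 98 + ["night"] * 2   # mix it was trained on

def distribution(samples):
    counts = Counter(samples)
    return {k: counts[k] / len(samples) for k in counts}

def sample_bias_gap(train, target):
    """Largest difference in class proportions between train and target."""
    p, q = distribution(train), distribution(target)
    return max(abs(p.get(c, 0) - q.get(c, 0)) for c in set(p) | set(q))

print(sample_bias_gap(biased_train, deployment))   # large gap: sample bias

# Rebalancing the training mix to match deployment removes the gap.
balanced_train = ["day"] * 60 + ["night"] * 40
print(sample_bias_gap(balanced_train, deployment))
```

This is the kind of check a sampling-aware data scientist runs before training, not after deployment.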

2. Prejudice bias

Prejudice bias is a result of training data that is influenced by cultural or other stereotypes. For instance, imagine a computer vision algorithm that is being trained to understand people at work. The algorithm is exposed to thousands of training data images, many of which show men writing code and women in the kitchen.

The algorithm is likely to learn that coders are men and homemakers are women. This is prejudice bias, because women obviously can code and men can cook. The issue here is that training data decisions consciously or unconsciously reflected social stereotypes. This could have been avoided by ignoring the statistical relationship between gender and occupation and exposing the algorithm to a more even-handed distribution of examples.

Decisions like these obviously require a sensitivity to stereotypes and prejudice. It’s up to humans to anticipate the behavior the model is supposed to express. Mathematics can’t overcome prejudice.

And the humans who label and annotate training data may have to be trained to avoid introducing their own societal prejudices or stereotypes into the training data.

3. Measurement bias

Measurement bias is the systematic value distortion that happens when there’s an issue with the device used to observe or measure. This kind of bias tends to skew the data in a particular direction. As an example, shooting training data images with a camera with a chromatic filter would identically distort the color in every image. The algorithm would be trained on image data that systematically failed to represent the environment it will operate in.
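The character of measurement bias is easy to simulate (the sensor values and offset below are invented): every reading is shifted by the same amount, and a second, trusted device is what lets you estimate and remove the distortion.

```python
import random

random.seed(0)
# Readings from a hypothetical trusted reference device.
true_values = [random.gauss(100, 5) for _ in range(1000)]

# A miscalibrated sensor adds the same offset to every reading, much
# as a chromatic filter would shift the color in every image.
OFFSET = 7.0
biased_readings = [v + OFFSET for v in true_values]

# More data doesn't help: the distortion survives any sample size.
# Comparing against the reference device exposes the systematic
# offset, which can then be subtracted out.
offset_estimate = (sum(biased_readings) - sum(true_values)) / len(true_values)
calibrated = [r - offset_estimate for r in biased_readings]
```

Note that averaging the biased readings alone would never reveal the problem; the cross-device comparison is what does the work.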

This kind of bias can’t be avoided simply by collecting more data. It’s best avoided by having multiple measuring devices, and humans who are trained to compare the output of these devices.

4. Algorithm bias

This final type of bias has nothing to do with data. In fact, this type of bias is a reminder that “bias” is overloaded. In machine learning, bias is a mathematical property of an algorithm. The counterpart to bias in this context is variance.

Models with high variance fit training data readily and embrace complexity, but are sensitive to noise. Models with high bias, by contrast, are more rigid, less sensitive to variation and noise, and prone to missing complexity. Data scientists are trained to strike an appropriate balance between these two properties.
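The trade-off can be demonstrated on synthetic data (the linear relationship and noise level below are invented for illustration): a constant predictor is too rigid, a memorizing 1-nearest-neighbor predictor fits the training set perfectly but chases its noise, and a simple linear fit balances the two.

```python
import random

random.seed(0)
true_fn = lambda x: 2 * x + 1                     # underlying relationship

def make_data(n):
    return [(x, true_fn(x) + random.gauss(0, 1))
            for x in (random.uniform(0, 10) for _ in range(n))]

train, test = make_data(50), make_data(50)

def mse(model, data):
    return sum((model(x) - y) ** 2 for x, y in data) / len(data)

# High bias: a constant predictor is too rigid to capture the trend.
mean_y = sum(y for _, y in train) / len(train)
constant = lambda x: mean_y

# High variance: 1-nearest-neighbor memorizes the training set,
# fitting it exactly while chasing its noise.
def nearest(x):
    return min(train, key=lambda p: abs(p[0] - x))[1]

# Balanced: simple least-squares linear regression.
n = len(train)
sx = sum(x for x, _ in train); sy = sum(y for _, y in train)
sxx = sum(x * x for x, _ in train); sxy = sum(x * y for x, y in train)
slope = (n * sxy - sx * sy) / (n * sxx - sx * sx)
intercept = (sy - slope * sx) / n
linear = lambda x: slope * x + intercept

for name, model in [("constant", constant), ("1-NN", nearest), ("linear", linear)]:
    print(f"{name:8s} train={mse(model, train):6.2f} test={mse(model, test):6.2f}")
```

The 1-NN model posts a training error of zero yet a worse test error, the signature of high variance; the constant model is equally bad everywhere, the signature of high bias.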

Data scientists who understand all four types of AI bias will produce better models and better training data. AI algorithms are built by humans; training data is assembled, cleaned, labeled and annotated by humans. Data scientists need to be acutely aware of these biases and how to avoid them through a consistent, iterative approach, continuously testing the model, and by bringing in well-trained humans to assist.

6 Projects And Platforms That Are Making It Easy To Use Cryptocurrency In Everyday Life

Cryptocurrency has many advantages that make it a better monetary system than has ever existed before. But those benefits don’t matter if cryptocurrencies aren’t actually being put to use in the real world. For cryptocurrencies to realize their potential, there has to be a certain amount of infrastructure making it possible for people to make them part of their daily lives.

As a method of payment between two individuals, transferring crypto is as easy as scanning a QR code.

For a business, government, or other institution, there are many more layers of the process to consider. A payment for a good or service at a place of business has to be recorded in their existing accounting systems, they may need to have ways for multiple employees to handle payments simultaneously, and there may be government regulations that affect the process.

For corporations or other institutions to participate in the growing cryptocurrency economy, they will need systems that help make cryptocurrency transactions compliant with existing protocols. Many companies are stepping up to try and help build those systems.

Here we list 6 notable efforts to make it easier to use crypto for everyday transactions.

6 Cryptocurrencies and Platforms Made for Everyday Use

Metal Pay

Metal Pay functions only in the United States, where they are effectively a bank. You can open a fiat account with them, one that is accredited and regulated like any other bank.

Where they diverge from a traditional bank, though, is in their associated cryptocurrency token, MTL.

When you make regular purchases using the soon-to-be-released Metal Pay card or app, you earn MTL tokens through a system called Proof of Processed Payment (PoPP, or just “pop”). The idea is to create and distribute new coins to the people who use the system, instead of the usual approach most other cryptocurrencies employ of rewarding miners.

With Metal Pay, you are reimbursed roughly 5-10% of your purchase amount in equivalent MTL (although there are a few layers of math on top of that which regulate maximum daily payouts, so the actual amount won’t be a straight percentage). The goal is to keep distribution balanced between rewarding those who use the system while not devaluing the currency by releasing too many tokens too quickly.
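A capped cashback scheme of this shape can be sketched as follows. The rate and daily cap are illustrative guesses, not Metal Pay’s actual parameters, which they describe only as “a few layers of math”:

```python
# Hypothetical parameters, not Metal Pay's real ones.
RATE = 0.05          # assumed 5% reward on each purchase
DAILY_CAP = 50.0     # assumed maximum MTL-equivalent payout per day

def reward_for(purchase_amount, paid_out_today):
    """Percentage reward, clipped so daily payouts never exceed the cap."""
    remaining = max(DAILY_CAP - paid_out_today, 0.0)
    return min(purchase_amount * RATE, remaining)

paid = 0.0
for purchase in [100, 400, 800]:
    r = reward_for(purchase, paid)
    paid += r
    print(f"purchase {purchase}: reward {r}")   # 5.0, then 20.0, then capped at 25.0
```

Once the cap is reached, further purchases earn nothing that day, which is one way to balance rewarding users against releasing tokens too quickly.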

In this system, MTL is put in the hands of users who are just spending regular fiat money, much like the loyalty point systems offered by many retailers and credit cards. Once users start accumulating MTL, the next logical question is what to do with it.

Metal Pay hopes that the time will come when MTL will be as much a currency as anything else, and people will use their MTL to buy whatever they want. And while MTL is intended purely as a day-to-day spending currency, it’s an ERC-20 token, so you can easily exchange it for other currencies until the day comes when you can spend it directly.

Pundi X

Pundi X is a payment processor for merchants that makes it as easy as possible to receive cryptocurrencies.

So far, they have launched a device that sits on top of a counter — presumably beside a cash register — where users can interface with the Pundi X platform using QR codes, RFID chips, or cards. In the future, they plan to have self-checkout machines and other tools to help facilitate consumer retail purchasing.

They also provide a software system to help organize the payment information that comes in as users spend cryptocurrency at stores using the Pundi X system.

Pundi X covers all aspects of the merchant side of a retail sale, so merchants can essentially drop it into place and be ready to accept just about any major cryptocurrency. It also provides all the functionality to receive non-crypto payments like cash and credit, so merchants don’t have to run two parallel systems, but can have one integrated, all-encompassing system for sales and services.

Interestingly, Pundi X is developed in Indonesia, a country that currently does not allow cryptocurrency to be used as a currency. Pundi X has nonetheless distributed its system among thousands of retailers there, which means it has stress-tested its regular fiat payment systems.

A retailer using their platform can be confident that their regular cash and credit business will be handled correctly, so that when it comes time to add in crypto, nothing will be sacrificed.


BillPay

BillPay has a novel approach to getting cryptocurrency into your life. BillPay wants you to ship them your coins — as in your literal metal coins you use as spare change — and they will convert them into crypto. You can then use crypto amounts to pay bills.

BillPay makes its money by charging 1.99% on transactions, and you also have to factor in what it costs to ship them your coins. They only take US coins and operate within the United States, so there are no international shipping costs, but still, shipping metal around the country can’t be cheap.

Assuming you get your coins to them for a price you can live with and their service fee is acceptable, then a question that comes up is: how many bills can be fully covered by the amount of spare change one has lying around? They say they can handle payments of up to US$9,000, which is a lot of spare change.

Probably the most appealing aspect of BillPay is the prospect of taking spare coinage and converting it into crypto. Many people make a savings habit out of siphoning spare change from their wallet into a separate fund that can be used to buy fun or useful things at the end of the year.

Having crypto as one option for what to do with those coins that get set aside might be more appealing than covering the phone bill.

Living Room of Satoshi

Living Room of Satoshi is a bill payment system similar to BillPay, but it operates only in Australia. That’s unfortunate for everyone else, because after many years in the game, Living Room of Satoshi has built a reputation for being solidly reliable.

Living Room of Satoshi started out with just Bitcoin, back when Bitcoin was essentially the only coin available, but they’ve kept up with the evolving crypto world and now accept roughly a dozen of the most common cryptocurrencies.

Living Room of Satoshi does not charge fees. Instead, they make their profit entirely on the spread of the exchange rate between crypto and Australian dollars, which means that when you pay your bills, your money won’t go quite as far as if you paid directly in fiat. Then again, all services charge for convenience, so this isn’t really cause for concern.

You can also pay your rent and other substantial obligations with Living Room of Satoshi, which really gets close to realizing a life where you forsake fiat cash for crypto entirely.


CitiCash

CitiCash is a cryptocurrency that aspires to be used every day like any fiat currency. To get there, instead of offering incentives or services, they have put their focus almost entirely into ease of use.

They’ve created the CitiCash wallet with simplicity and intuitiveness as the primary drivers behind all design decisions, and you would be hard pressed to find a wallet that does a better job at providing the essential information in a clear, easy-to-access way. They also have plans to create debit cards, which will facilitate payments in a manner already very familiar to a large part of the world.

The CitiCash wallet is strictly for storing their token, CCH, so the extent to which you’ll be able to use it in your daily life will depend entirely on how far they can get merchants and people to take it on. In one sense, this could be a challenge, as there are already many crypto tokens vying to be everyone’s money for daily use.

On the other hand, there are many retail systems, such as Pundi X, listed above, which make it possible for merchants to accept any and all cryptocurrencies, automatically changing the received crypto into whatever token the merchant prefers.

If such systems become ubiquitous, then there is less need for consumers to all buy into the same currency. All things being equal, they may very well opt to go with the currency that’s easiest to use, and that will be defined in a big way by the wallet.

To get the wallet and learn more about the CitiCash token sale, check out their website.


MenaPay

MenaPay is another project trying to make cryptocurrency a reality in day-to-day life. They offer both a wallet for consumers and payment processing software for merchants to get both sides of the retail process on board with using crypto.

They’re based largely in the Middle East, and, for those who might want it because of religious concerns, they offer a “100% Islamic” service.

While this gives MenaPay an advantage in their home region when dealing with banks and local regulations that may have different constraints than other parts of the world, for people in other regions who don’t adhere to Islamic practices, the fundamentals of buying and selling with MenaPay are as standard as any other cryptocurrency.

As you investigate MenaPay, you will notice a lot of the same features and promises offered by many other cryptocurrencies, such as ease of use, lower fees, faster transactions, and so on.

However, there is one standout feature that makes MenaPay very different from the rest.

They are pegging the value of their token, MenaCash, to the US dollar, so it is not subject to the price fluctuations of cryptocurrencies that rely on market forces to establish their value.

This may raise some eyebrows, especially since Tether, the digital currency pegged to the US dollar that was created as a way to make purchasing cryptocurrency easier, has become notorious for a lack of audits proving it actually has the funds to back its claims. Any currency basing its value on a peg to a fiat currency will have to contend with the legacy of perception Tether created.

The difference between MenaPay and Tether, though, seems to be that Tether exploded in terms of issuance, raising suspicions about whether the amount needed to back the amount in circulation could have been acquired as fast as it was. MenaPay, however, is working with established banks, and has backers in the region with substantial funds to offer.

And maybe most importantly, the amount they need to hold in trust is substantially smaller than what Tether would require. The aims of MenaPay are considerably more feasible.

Nonetheless, unlike most cryptocurrencies, which aspire to be as trustless as possible, MenaPay will require some faith that the funds backing it exist. Once that faith is established, MenaPay could make purchasing in regions where banking can be complicated a lot smoother.

Cryptocurrency bitcoin marks 10 years

October 31, 2008 marked the birth of bitcoin. Ten years on, the world’s first cryptocurrency is at the forefront of a complex financial system viewed warily by markets and investors.

From its first evocation amid the global financial crisis, in a white paper written under the pseudonym Satoshi Nakamoto, whose identity remains unknown, bitcoin conveyed a political vision.

The “abstract” set out in the paper for bitcoin, currently worth about $6,400 per unit from a starting point of virtually zero, was for “a purely peer-to-peer version of electronic cash (that) would allow online payments to be sent directly from one party to another without going through a financial institution.”

A decade on, this continues to be carried out via a decentralised registry system known as a blockchain.

Such ambition for a cryptocurrency was fuelled by the bankruptcy of US investment bank Lehman Brothers in September 2008, an event that discredited the traditional system of “a small elite of bankers… (that) establishes monetary rules imposed on everybody”, according to Pierre Noizat, founder of the first French bitcoin exchange in 2011.

Following its creation, bitcoin evolved for several years away from the public eye, attracting mostly geeks and criminals—the latter seeing it as a way to launder money.

After bitcoin surpassed $1,000 for the first time in 2013, it began to attract the attention of financial institutions.

The European Central Bank compared it to a Ponzi scheme, but Ben Bernanke, then head of the US Federal Reserve, hailed its potential.

A turbulent childhood

In early 2014, the cryptocurrency faced its biggest crisis to date, with the hacking of the Mt. Gox platform, where about 80 percent of all bitcoins were traded.

The result was a collapse in their value, leading to predictions of the virtual currency’s death.

It took until early 2017 for bitcoin’s price to fully recover.

A technician inspects bitcoin mining equipment at Bitfarms in Saint-Hyacinthe, Quebec

That marked the start of a “turning point” according to Noizat, as the controversial cryptocurrency then rocketed to more than $19,500 by the end of the year according to Bloomberg data.

That meant bitcoin had a total capitalisation of more than $300 billion, according to the specialised website Coinmarketcap.

By January 2018 the value of all cryptocurrencies exceeded $800 billion, before the bubble burst.

The concept of a digital currency has progressed substantially thanks to bitcoin, cryptocurrency analyst Bob McDowall told AFP, pointing to the creation of 2,000 rivals.

“It becomes more than a technological, economic innovation. It almost becomes a religion for some people,” he noted.

According to Anthony Lesoismier, co-founder of investment fund Swissborg which offers portfolios based on blockchain, “the real revolution has been on a philosophical level”.

But for economist Nouriel Roubini, decentralisation in crypto is a myth.

“It is a system more centralised than North Korea. Miners are centralised, exchanges are centralised, developers are centralised dictators,” Roubini tweeted.

While the initial idea was for bitcoin to facilitate payments, a majority of observers recognise that, owing to the volatility of its value, it is used above all as a store of value or as a speculative instrument.

“You need 20 years for this kind of… technology to take hold completely,” said Noizat, who is banking on faster transaction speeds for bitcoin.

As it stands, only about five to ten bitcoin transactions can be processed per second, compared with several thousand on the Visa network.
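To put that gap in perspective, here is a back-of-the-envelope Python sketch using the figures quoted above. The numbers are illustrative only: 7 transactions per second is taken as a midpoint of “five to ten”, 2,000 per second stands in for Visa’s “several thousand”, and the backlog size is invented for the example.

```python
# Illustrative throughput comparison; figures are rough assumptions,
# not measured network data.
BITCOIN_TPS = 7      # midpoint of the "five to ten" transactions/second
VISA_TPS = 2000      # stand-in for "several thousand" per second

def hours_to_clear(backlog_tx: int, tps: float) -> float:
    """Hours needed to process a backlog at a given throughput."""
    return backlog_tx / tps / 3600

backlog = 250_000  # hypothetical number of pending transactions
print(f"Bitcoin: {hours_to_clear(backlog, BITCOIN_TPS):.1f} hours")
print(f"Visa:    {hours_to_clear(backlog, VISA_TPS):.2f} hours")
```

At these assumed rates, a backlog that Visa-scale infrastructure would absorb in minutes would take bitcoin the better part of a working day, which is why transaction speed is the improvement Noizat is banking on.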

Looking ahead, US market regulators are considering applications for bitcoin-based exchange-traded funds, which, if approved by the Securities and Exchange Commission, would see the cryptocurrency become part of a financial system it set out to bypass.

“We must cross some bridges in the short term” to generate the general public’s interest and trust, said Lesoismier, who described himself as both an “idealist” and “realist”.

AT&T Extends Broadband Connectivity Across Rural Alabama

AT&T Inc. recently expanded its Fixed Wireless Internet service across rural and underserved locations in Alabama. With this endeavor, the company has now rolled out broadband connectivity in 18 states across the United States.

Through its Fixed Wireless Internet service, AT&T will offer an Internet connection with download speeds of at least 10 Mbps and upload speeds of at least 1 Mbps. The connectivity is delivered from a wireless tower to a fixed antenna placed on the customer’s home. This cost-effective connection is arguably one of the best ways to deliver high-quality, faster broadband to customers in underserved rural areas.

Since 2015, AT&T has invested about $1.2 billion to improve the wireline and wireless networks in Alabama. The company will use fiber optics to connect the wireless towers with its dedicated global network to deliver the Fixed Wireless Internet service. AT&T has an extensive coverage of fiber optics in Alabama with more than 2 million strand miles of fiber optics spanning the state.

With the inclusion of Alabama, AT&T presently offers this service to more than 440,000 locations across 18 states and further aims to provide Internet access to more than 1.1 million locations by the end of 2020. In addition, the company is gearing up to launch the first standards-based 5G services to consumers in multiple U.S. markets by the end of 2018. AT&T has been working since 2017 to lay the foundation for a mobile 5G network and has completed network upgrades in 23 major cities.

AT&T’s wireless growth opportunities from the launch of standards-based mobile 5G services and the FirstNet project remain impressive. Notably, completion of the 3rd Generation Partnership Project’s first implementable 5G new radio (NR) specification has set the stage for the global mobile industry to start full-scale development of 5G NR for large-scale trials and commercial deployments in 2019.

Despite the positives, AT&T’s shares have recorded a decline of 5.5% in the past six months while the industry has rallied 6.7%. It remains to be seen how the company’s recent initiatives help its share price performance in the long run.

USDA Invests $600 Million In Rural Broadband, But Farmers Still Struggle To Connect

Late last month, the U.S. Department of Agriculture announced plans to add $600 million to fund e-Connectivity, a pilot program aimed at bridging the rural digital divide by improving broadband internet access for American farmers. But the rural digital divide is wider than ever, as farmers struggle to run tech-dependent businesses without broadband.

According to a 2016 Federal Communications Commission (FCC) report, 39% of rural Americans don’t have broadband internet access, but Daiquiri Ryan, a policy fellow at the non-profit Public Knowledge, says that number is almost certainly inaccurate.

“All of that data that the FCC collects…is self-reported by internet providers and it’s only done by the census block, [which] means if one person on the census block is served by that provider…the entire census block is considered served.” But in very rural, sparsely populated areas, says Ryan, that one house with service might be the only one with actual service for miles, so when the entire area of the map shows up as served, it’s not an accurate picture.
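Ryan’s point about census-block reporting is easy to demonstrate numerically. The toy Python sketch below uses entirely hypothetical data (the block names and household counts are invented for illustration) and applies the block-level rule she describes: if any one location in a block is served, every household in that block counts as served.

```python
# Hypothetical census blocks; each flag marks whether that household
# actually has broadband service.
blocks = {
    "block_A": [True, False, False, False, False],  # 1 of 5 households
    "block_B": [False, False, False],               # none served
    "block_C": [True, True, True, True],            # fully served
}

# Ground truth: share of households actually served.
households = [h for flags in blocks.values() for h in flags]
true_rate = sum(households) / len(households)

# Block-level rule: one served location marks the whole block as served.
reported = [any(flags) for flags in blocks.values() for _ in flags]
reported_rate = sum(reported) / len(reported)

print(f"households actually served:  {true_rate:.0%}")    # 42%
print(f"reported served (by block):  {reported_rate:.0%}")  # 75%
```

In this invented example the block-level rule nearly doubles the apparent coverage, which is exactly the distortion Ryan describes in sparsely populated rural areas.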

Worse yet, though members of Congress often complain about the lack of good quality data during rural broadband hearings, Ryan says no one seems to want to fix the problem. “[T]hey’re not doing much to fund better data collection, [which is] the first hump.”

In today’s farming operations, almost all aspects of production are enhanced by technology. “We have technology in our processing [operations], our feed mill systems” as well as trucks and tractors, explains Shawn Tiffany, owner and operator of Tiffany Cattle, a cattle feedlot located in Herington, Kansas. Tiffany happens to be a strong proponent of traditional farming techniques like cover cropping, but his cattle farm also relies on computers for tasks like calculating the feed for the cows and for tracking customer data.

Consumers may assume technology is only used on industrial-size farms, but that’s not true. Virtually all farmers today rely on technology in their day-to-day work, including those who run smaller, more boutique operations. Brian Fiscalini, one of the owners of Fiscalini Cheese Company, a farmstead cheesemaking and farming operation in Modesto, California, recently spoke at the International Food Information Council Food Innovation Summit in Washington, D.C., to say, among other things, that yes, even artisanal cheese makers rely on technology.

Jodi DeHate, who grew up on a dairy farm and today works with a number of dairy producers in rural Michigan, says today’s dairy farms are sophisticated operations, using sensors to detect everything from butterfat content to whether a cow might be sick. Even though she has the experience to tell when a cow isn’t feeling well, the technology provides more precise data, which helps guide farmers in making difficult decisions about their cows’ lives and welfare.

On the farm, spotty or limited internet coverage can be a costly problem. “Whether you’re on Wall Street or out here in the middle of Kansas, data equals dollars,” says Tiffany. Tiffany also sits on the board of directors for local rural telecom provider Tri-County Telecom Association (TCT), whose coverage relies on a fiber optic network. Tiffany’s original feedlot is in Herington, so computers at that location run on that fiber optic network, but his newer feedyard falls outside of TCT’s coverage area. As a result, he says, the second feedlot’s wireless system just isn’t as good. In the past year, he’s made frequent calls to a technician to try to get the system working, but the problem persists.

In Michigan, DeHate says it’s a fact of rural farming life now that virtually all of the local equipment dealers have to have employees on staff dedicated solely to tech support. And farmers often find themselves having to get creative, MacGyver-ing their way to tech solutions by taking “the flash drive from their tractors…and [downloading]…information [onto] the computer in the house or vice versa.”

Coverage over the vast but sparsely populated areas in which DeHate lives and works is a constant challenge, especially since her work covers four different rural counties. “Once you get past the little villages or towns, [it’s] either wireless…or you’re stuck with satellite or [using] your phone as a hotspot.” DeHate has heard from farmers using the Wi-Fi at their local McDonald’s just to get their work done.

Lack of service doesn’t just impact the farm’s business, but the farmer’s family, too. Kids might be able to check out iPads or computers from school, but without good wireless service, the computers won’t work. DeHate says it’s especially worrisome if you can’t get service and you’re out working in the field, because “if you do have an accident, you [have to] hope you can actually walk somewhere or…make sure somebody can get to you.”

From a policy perspective, Public Knowledge’s Ryan says there are multiple roadblocks to getting rural Americans the coverage they need: “A lot of carriers don’t want to build out to these areas, because it’s very expensive, and it doesn’t turn a profit for them.” Worse than that, says Ryan, in many places the existing copper is rotting and there’s a backlog of repair requests. “[People call and are told] we only have five technicians for the entire state, so it’s going to take a couple of weeks.” Ryan says the FCC has rolled back a number of regulations and now “there’s really no complaint process.”

Tiffany, who sits on the board of a rural telecom provider and is familiar with its business challenges, is sympathetic to the position rural carriers find themselves in as it’s tough to make a financial case for some of these services when the return on investment just isn’t there. That’s why the government funding was put in place to begin with, but Tiffany is worried about the new farm bill funding provisions, saying they seem to favor wireless technology rather than an investment in the necessary infrastructure. “I’m not going to say we’ll never get fiber but it’s going to make it very very difficult to put [new] fiber in [at this point].”

Ryan says the USDA has at least moved closer to implementation with new funding to its e-Connectivity program, and one good feature is that almost any entity can apply, including telecom providers and municipalities. But it’s also true that investment in underlying infrastructure is still lacking. According to Ryan, “[the] trend right now, especially with the Senate majority and the House majority in the GOP is to say things like satellite service is the future. 5G is the future, [or] wireless internet [is the future but]…we know [that] in real life…a wireless connection [isn’t] the same quality of service as you get over a fixed line.”

Rural Maine communities taking lack of broadband into their own hands

They’re tired of waiting for the private sector – and state and federal governments – to bring them up to speed.

Philadelphia residents Wayne and Katy Kach were hoping to move to the Prospect area in rural Maine to escape their cramped surroundings and be closer to family members, but it isn’t going to happen in the foreseeable future.

Like many urban professionals whose desire for more bucolic surroundings could benefit rural Maine communities, the couple’s relocation plan was a nonstarter because both require reliable high-speed internet access to do their jobs. The area where they want to live doesn’t offer it.

“We were hoping to find a place that would be able to afford us a little bit of land,” Wayne Kach said. “We found a few properties in the Prospect area, but it just kind of got shut down because there’s just no way for us to work.”

Many rural communities in Maine have been waiting decades for the major internet service providers to bring broadband service to their areas, a situation exacerbated by the state having the second slowest internet speeds in the country. The lack of broadband is a deterrent to would-be residents and businesses, and it thwarts local efforts at economic development. It also deprives existing residents of opportunities for entertainment, education, employment and digital health services.

Proponents of broadband expansion in Maine say rural areas have been left behind because internet service providers don’t see a financial benefit to upgrading their rural networks, the state lacks strong leadership to push comprehensive broadband initiatives, and many rural residents still don’t understand why broadband service is important to their communities.

The issue is finally gaining some political traction at the statewide level. All four of Maine’s gubernatorial candidates have said expanding rural broadband service is a top priority, although their proposed solutions differ.

Maine consistently places at or near the bottom of national rankings for internet connectivity, and the primary reason is the state’s poorly connected rural areas.

Roughly 15 percent of Maine residents still don’t have access to broadband service as defined by the federal standard of at least 25 megabits per second download and 3 Mbps upload, said Peggy Schaffer, who runs the Maine Broadband Coalition, an informal federation of public policy professionals, educational institutions, businesses, nonprofit organizations and private individuals seeking to improve broadband access in the state. At 25 Mbps, a single user can engage in moderate internet usage such as streaming high-definition video, or multiple users can engage in light usage such as streaming music or browsing the internet.

Most of the roughly 200,000 Mainers without broadband access live in rural communities. At least 20,000 of them have no internet access at all.

The crisis is real. It affects out-of-work residents who can’t search for jobs online, home-based business owners who can’t connect digitally with customers, students who can’t complete homework assignments from home, and seniors who can’t rely on potentially lifesaving online health services.

But a growing number of rural towns are no longer satisfied with waiting for the private sector to bring them up to speed. Despite inadequate help from the state and federal governments, a handful of communities in Maine are working on taxpayer-subsidized broadband infrastructure projects that could serve as a model for the rest of the state. Another 50 or so towns are trying to drum up public support for broadband projects of their own.

One innovative project in the St. Croix Valley would create Maine’s first publicly owned broadband network with providers competing for customers at gigabit speed, and others would provide a similar level of service through public-private partnerships.

“There’s a lot of talk about ‘We need broadband,’ but the issue is how we do it, and that is on many levels,” Schaffer said. “One of them is money, and another one is structure: What does the structure look like for us doing broadband? And there’s a lot of different opinions about that.”


Rural Maine resident David Reed said the internet service in his community is so slow and unreliable that he and his wife are thinking about leaving town.

“I envy people who can get a wired service at all, even if it’s ‘just’ DSL,” Reed said. “I live on the coast, south of Bangor outside Belfast in Swanville, and I can’t get anything at all, not even DSL. We are looking to sell the house and move, if the right opportunity presents itself, because of lack of broadband.”

Reed said his current internet service, a fixed wireless service provided by UniTel, offers a maximum download speed of about 8 Mbps – when weather conditions are ideal. The rest of the time, download speeds range from 1 to 4 Mbps, he said.

Reed, who works in information technology, said not having reliable internet service has made it more difficult for him to do his job. He said the situation has been even more problematic for his wife, who had tried to start a home-based business involving online education and training but ultimately had to abandon the idea.

“It just didn’t work – she would spend all day trying to get through some of the training, and maybe get an hour’s worth out of a full eight- or 10-hour day,” he said. “Eventually she ended up going back to work for somebody else.”

Population density is a big factor in determining whether the big internet service providers such as Spectrum, Xfinity and Consolidated Communications will invest in upgrading a community’s communications infrastructure to accommodate wired broadband service, said Julie Jordan, director of Downeast Economic Development.

Just because a community is wired for telephone and cable TV service does not mean it has the required infrastructure for broadband internet service. Adding broadband requires a significant investment in additional equipment to boost the signal.

In most of rural Maine, the big providers simply don’t think there is any financial upside to providing broadband service, Jordan said, and they are not legally required to do so.

Jordan’s organization serves the St. Croix Valley area, which includes Calais, Baileyville and surrounding communities. She said the average residential internet download speed in her community is about 3 Mbps, and the maximum is about 10 Mbps.

Jordan said one of the problems for rural communities is that it is difficult to convince private companies to upgrade their infrastructure when the area already has internet service, even though the existing service isn’t very good. There are multiple internet providers in the St. Croix Valley, she said, but all of them provide service that is slow and unreliable.

“But because we did have service, we were overlooked as far as getting any kind of grant money or encouraging providers that were here to offer better broadband,” Jordan said. “You kind of get the same old story: ‘Be happy with what you have.’ ”


Ellsworth resident and small-business owner Emily Shaffer said she loves her rural community and has no intention of leaving, but Shaffer said the lack of broadband service has made it more difficult for her to conduct business.

Shaffer, who designs and sells custom jewelry, said it is a slow and tedious process to do certain things such as uploading images of new designs to her website.

“It’s a pain, but it doesn’t stop me from doing business,” she said. “It’s just time and aggravation with things that I would otherwise be able to just click and go.”

The reluctance of big internet providers isn’t the only obstacle preventing rural Maine from joining the modern communications age, said Carla Dickstein, senior vice president of research and policy development at Brunswick-based community development financial institution Coastal Enterprises Inc., one of the founders of the Maine Broadband Coalition. Some small communities in Maine don’t even push for broadband because they don’t see the value in it, she said.

“Part of the problem is that the communities aren’t seeing what the future is,” Dickstein said. “I don’t think everyone is seeing why it (broadband) is so important, and if you don’t see that, then it’s hard to get your town mobilized and to make it a priority.”

In order to receive broadband service, small municipalities need to put “skin in the game” by helping to fund local infrastructure projects through bond sales and other means, even if it’s a small amount to begin with, she said.

Funding is a challenge, Dickstein said, but there is actually quite a bit of financial assistance available through state and federal grants that many rural communities aren’t even trying to tap into.

“I think the challenge is the lack of understanding of the future even more so than the lack of money,” she said.

In some communities, getting broadband service will require a team of local leaders who are willing to go door to door and sell individual residents on the value of broadband service and why they should agree to spend some of their tax dollars on infrastructure, Dickstein said.

Another obstacle is a lack of strong leadership at the statewide level to promote rural broadband expansion, said Nick Battista, policy officer at the Island Institute, an economic and community development organization based in Rockland that has been assisting small island and coastal communities with local broadband projects.

“(State broadband authority) ConnectME is doing what they can with a limited amount of funds, but we need to be investing closer to $50 million to $100 million a year on rural broadband, rather than $1 million a year, in public funds to solve this problem,” Battista said. “Communities are recognizing that it’s an economic and social imperative – they need it for health care, for telemedicine, for education. You have kids who are going to the library after the library closes and sitting outside in the parking lot just to do their homework. That’s not how we build a strong state.”


There is hope on the horizon, but significant progress is going to require buy-in from Maine’s rural residents, said Fletcher Kittredge, CEO of GWI in Biddeford, one of the Maine-based internet service providers that have been working with rural communities on local broadband projects.

The number of rural towns where a majority of residents favor investing in broadband infrastructure is increasing, he said, and there are already a handful of municipal broadband projects in various stages of completion.

One example is Islesboro, an island community where the residents voted overwhelmingly in 2016 to invest in a $3.8 million broadband network that connects via an underwater fiber-optic cable to the statewide Three Ring Binder fiber-optic network.

Kittredge said the project took some convincing at first, but now more than 80 percent of Islesboro residents use the service, operated by GWI, which offers a 1 gigabit download speed for just $35 a month.

“It was an incredibly interesting project and we knew we were going to sink a lot of time into it … because it was the first one,” he said. “We’re incredibly pleased with the way it turned out because it could be a model for (success). I think it’s made an enormous difference for that community.”

The project required Islesboro to issue bonds to pay for the network’s construction, and then it contracted with GWI to build, own and operate the service. It’s a model that GWI and other small internet service providers are trying to replicate in other parts of the state.

“Right now, we’re looking at doing the towns of New Sharon, Dixfield and Blue Hill, and we’re desperately trying to go around getting people in those towns to affirmatively say, ‘If you build this, we would be interested,’ ” Kittredge said. “Because that’s a condition of applying for subsidized federal loans.”

An even more ambitious project is the Downeast Broadband Utility, a regional effort in the St. Croix Valley that, if successful, would create Maine’s first independent, publicly owned broadband network.

In partnership with Calais and Baileyville, Downeast Economic Development is working on a project to connect about 3,000 area households with gigabit fiber-optic internet service, said Jordan, the economic development group’s director.

Unlike any other broadband network in Maine, the Downeast Broadband infrastructure would be owned by the community, and internet service providers would pay to lease bandwidth on the network. A handful of providers have expressed interest, she said, but so far none have made binding commitments.

If successful, the rural community would have the only fiber-optic broadband network in Maine with multiple providers competing for business. If it fails, the community will have spent up to $3.1 million of taxpayer money building a network that no provider wants to use. It’s a risk area residents were willing to take.

“We’re rural, and we want to stay competitive and vibrant and become part of the rest of the world,” Jordan said.

How To Build A Connected City

Getting to be a Smart City first requires getting connected. A Connected City is a city or community that has the network infrastructure (fiber optics, Wi-Fi, small cells, towers) that allows for the efficient exchange and collection of information (voice, data, video) via a variety of devices, both public and private (sensors, cameras, phones, traffic signals).

Street furniture such as light poles, bus shelters, kiosks, waste containers and other street level infrastructure is ideal for deployment of next generation (5G) telecommunications gear. The goal is to get the antennas closer to the user.

Smart Cities encourage the growth of telecom infrastructure by creating both a policy and investment environment that facilitates growth. In each city and community, only a limited number of stakeholders have developed networks.

Stakeholders that usually control the majority of network assets in a city or community include: Utilities, Cablecos, Cellular Carriers, CLECs and Telephone Companies, Transportation Authorities and Public Safety Departments.

The city generally plays 3 roles in developing a Connected City since much of the investment will be borne by the Private Sector.

  • Regulatory and Policy Administration
  • Consumer of Network Technology
  • Owner and Provider of Network Technology and Street Furniture

Many Cities struggle to ensure broadband access for all consumers. Digital Inclusion initiatives involve everything from multi-stakeholder PPPs to Comcast’s national Internet Essentials program. Reaching rural communities continues to be difficult.

The City’s Role in Developing Network Infrastructure

Telecommunications is a core pillar of Smart City Infrastructure and requires an ecosystem of public private cooperation to maximize its impact. Cities that actively engage in creating a Connectivity friendly environment and develop a Broadband Strategy deliver significant benefits to their citizens. Aligning with the utilities that often own most of the pole infrastructure is a key step.

It is no longer necessary to convince Cities that Connectivity is vital. The challenge is in funding and executing. Network construction is capital intensive and requires an operating budget each year. Maintaining networks requires specialized equipment and skill sets most cities are not budgeted to support. The collaborative approach has proven to be successful. Policy models exist that have been driving investment and growth.

The City of San Jose, California recently provided this look at their Broadband Strategy for a Smart Cities Council event in Silicon Valley. They looked at how to ensure broadband connectivity was ubiquitous and affordable. Their model projects the benefits of collaboration between the public and private sectors.


San Jose started to put their plan into action in early 2018. As part of the plan, they took an aggressive stance with the 2 largest wireless carriers, AT&T and Verizon. They understood the importance to the carriers of timely deployments on City or Utility owned infrastructure.

San Jose successfully negotiated customized agreements with each carrier that ensure dense network technology will be deployed in San Jose to serve the main business districts and residential areas. The city also succeeded in getting the carriers to create a set-aside fund that enables it to provide network services to underserved parts of the city, advancing digital equity.

The City looked at its pole infrastructure and determined there were going to be 3 primary uses. The first use was as a traditional streetlight that would be upgraded to LED to meet sustainability goals. Second, the poles are critical to the broadband infrastructure needed to build out 4G and 5G. Third, poles are now being used to mount more and more things such as cameras, sensors and advertising. Connecting these things is called the Internet of Things.

First, the increased broadband collaboration led to new capabilities. San Jose developed a Demonstration Partnership Policy, which was established to support its Economic Development Strategy and city operations.

Under this policy, a SmartPole pilot project is being deployed. San Jose is working with Philips Lighting and local utility PG&E to deploy 50 SmartPoles with energy-efficient, wireless-controlled LED lighting. The city estimates that the LEDs will provide a 50-percent energy savings compared to conventional streetlights.

Second, the SmartPole also offers built-in 4G LTE small cells. This enhanced coverage serves the citizens of San Jose as well as provides capacity for IoT applications that the city may choose to implement. Philips also collaborated with PG&E to design a two-way communicating meter that sits on top of the SmartPole, rather than requiring the typical pedestal meter on the ground, thereby reducing street clutter.


Other cities have been collaborating with Utilities and Pole Manufacturers. The City and County of Denver recently released this guide to deal with small cells and pole attachments.

They collaborated with utility provider Xcel, Verizon, Aero Wireless and Jacobs Engineering to ensure a comprehensive look at both existing poles and new deployments. The results included a new design that will initially accommodate 1 small cell carrier but is being expanded to include 2 carriers’ gear. Denver County Engineer Jon Reynolds, who led the effort, commented, “At first we were talking different languages but once we established common goals, we were able to develop a few solutions that we are all pleased with.”

Jim Lockwood, President of Aero Wireless said, “Developing a new pole design that accommodates small cells was a priority for us all. We are seeing demand in most major markets now.”


Boston also took a collaborative approach and asked for submission of different designs. After consulting with all stakeholders, the City has begun to approve certain designs and make them available to all parties interested in deploying new poles. Mike Lynch, Director of Broadband for the City of Boston said, “There are a lot of new regulations being implemented and the demand for Small Cell deployments is rising fast. We got ahead of the design process and are approving most permits well under the timeline guidelines.”

