Shyam's Slide Share Presentations

VIRTUAL LIBRARY "KNOWLEDGE - KORRIDOR"

This article/post is from a third-party website. The views expressed are those of the author. We at Capacity Building & Development may not necessarily subscribe to them completely. The relevance and applicability of the content is limited to certain geographic zones; it is not universal.

TO VIEW MORE CONTENT ON THIS SUBJECT AND OTHER TOPICS, Please visit KNOWLEDGE-KORRIDOR our Virtual Library

Saturday, June 24, 2017

Blockchain explained 06-25

Why it matters: Like the internet in its early years, blockchain technology is hard to understand and predict, but could become ubiquitous in the exchange of digital and physical goods, information, and online platforms. Figure it out now.

What is a blockchain?

Blockchain is a term widely used to represent an entire new suite of technologies. There is substantial confusion around its definition because the technology is early-stage, and can be implemented in many ways depending on the objective.

“At a high level, blockchain technology allows a network of computers to agree at regular intervals on the true state of a distributed ledger,” says MIT Sloan Assistant Professor Christian Catalini, an expert in blockchain technologies and cryptocurrency. “Such ledgers can contain different types of shared data, such as transaction records, attributes of transactions, credentials, or other pieces of information.

“The ledger is often secured through a clever mix of cryptography and game theory, and does not require trusted nodes like traditional networks. This is what allows bitcoin to transfer value across the globe without resorting to traditional intermediaries such as banks.”

On a blockchain, transactions are recorded chronologically, forming an immutable chain, and can be more or less private or anonymous depending on how the technology is implemented. The ledger is distributed across many participants in the network — it doesn’t exist in one place. Instead, copies exist and are simultaneously updated with every fully participating node in the ecosystem. A block could represent transactions and data of many types — currency, digital rights, intellectual property, identity, or property titles, to name a few.
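To make the “immutable chain” idea concrete, here is a minimal sketch in Python (an illustration only, not how bitcoin or any production system is implemented): each block commits to the hash of its predecessor, so altering an old record invalidates every block after it.

```python
import hashlib
import json
import time

def block_hash(block: dict) -> str:
    """Hash a block's contents deterministically."""
    payload = json.dumps(block, sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

def make_block(data, prev_hash: str) -> dict:
    """Create a block that commits to its predecessor's hash."""
    return {"timestamp": time.time(), "data": data, "prev_hash": prev_hash}

def verify_chain(chain: list) -> bool:
    """Recompute every link; tampering anywhere breaks a later prev_hash."""
    for prev, curr in zip(chain, chain[1:]):
        if curr["prev_hash"] != block_hash(prev):
            return False
    return True

# Build a toy ledger of three records.
chain = [make_block("genesis", "0" * 64)]
for tx in ["alice->bob: 5", "bob->carol: 2"]:
    chain.append(make_block(tx, block_hash(chain[-1])))

assert verify_chain(chain)
chain[1]["data"] = "alice->bob: 500"  # tamper with history...
assert not verify_chain(chain)        # ...and verification fails
```

Real blockchains add consensus rules (such as proof of work) on top of this structure so that the many copies of the ledger agree on which chain is the true one.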

“The technology is particularly useful when you combine a distributed ledger together with a cryptotoken,” Catalini says. “Suddenly you can bootstrap an entire network that can achieve internet-level consensus about the state and authenticity of a block’s contents in a decentralized way. Every node that participates in the network can verify the true state of the ledger and transact on it at a very low cost. This is one step away from a distributed marketplace, and will enable new types of digital platforms.”

How is blockchain related to bitcoin?

Bitcoin, with a market cap of more than $40 billion, is the largest implementation of blockchain technology to date. While a lot of media attention has shifted from bitcoin to blockchain, the two are intertwined.

“When The Economist put blockchain on the cover in 2015, it wasn’t really about its use to support a digital currency anymore. It was all about the other applications this technology will unleash within the next 5 to 10 years,” Catalini says. “For example, in finance and accounting there is excitement about the ability to settle and reconcile global transactions at a lower cost using the technology. In logistics the attention is all on how you can use the immutable audit trail generated by a blockchain to improve the tracking of goods through the economy. Others are fascinated by the possibility to use this as a better identity and authentication system.”

There are two types of costs blockchain could reduce for you: the cost of verification and the cost of networking. 
 
So what’s the big deal?

In a recent paper, Catalini explains why business leaders should be excited about blockchain — it can save them money and could upend how business is conducted.

Every business and organization engages in many types of transactions every day. Each of those transactions requires verification. In many cases, that verification is easy. You know your customers, your clients, your colleagues, and your business partners. Having worked with them and their products, data, or information, you have a pretty good idea of their value and trustworthiness.

“But every so often, there’s a problem, and when a problem arises, we often have to perform some sort of audit,” Catalini says. “It could be actual auditors coming into a firm. But in many other cases, you’re running some sort of process to make sure the person claiming to have those credentials did have those credentials, or the firm selling you the goods did have the certification. When we do that, it’s a costly, labor-intensive process for society. The marketplace slows down and you have to incur additional costs to match demand and supply.”

“The reason distributed ledgers become so useful in these cases is because if you recorded those attributes you now need to verify securely on a blockchain, you can always go back and refer back to them at no cost,” he says. “It’s costless verification. So when you think about why bitcoin works, it’s because it can cheaply verify that the funds are actually there. You can transfer value from here to anywhere on the globe at almost zero transaction cost. Sending secure messages that carry value does not require a bank or PayPal in the middle anymore.”

In short: Because the blockchain verifies trustworthiness, you don’t have to. And the friction of the transaction is reduced, resulting in cost and time savings.

Using a blockchain can also reduce the cost of running a secure network. This will happen over a longer timeline, Catalini says, perhaps a decade. The internet has already allowed for a faster, less stilted exchange of goods and services. But it still needs intermediaries, however efficient they may be — think eBay, Airbnb, and Uber.

“Those intermediaries are costly and earn rents for processing payments, maintaining a reputation system, and matching demand and supply,” Catalini says. “This is where blockchain technology, combined with a cryptotoken, allows you to rethink an entire value chain from the ground up.

“That’s where incumbents should be slightly worried, because in the long run the way you may be delivering value to your customers and competing against other companies could be fundamentally different.”

Blockchain technology could mean greater privacy and security for you and your customers.
                 
Catalini calls the problem data leakage. When you give a bartender your driver’s license, all that person needs to know is your age. But you’re revealing so much more — your address, your height, whether you’re an organ donor, etc.

The same thing happens in commercial transactions.

“As your business partner, I need to know that you’re trustworthy and reliable, but for simple transactions I don’t really need to know many other things about you,” Catalini says. “Information disclosure is increasingly becoming a cost because of data breaches. We can’t keep our data private and it’s becoming increasingly complex to do so within large organizations. So imagine a model where you can verify certain attributes are true or false, potentially using a decentralized infrastructure, but you don’t have to reveal all these attributes all the time.”
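A toy sketch of the selective-disclosure idea Catalini describes, in Python. This is not a real zero-knowledge or blockchain protocol: a hypothetical trusted issuer signs a single claim, and a verifier checks that claim alone, never seeing the other attributes. (A real system would use public-key signatures so the verifier does not hold the issuer's secret.)

```python
import hmac
import hashlib

ISSUER_KEY = b"issuer-secret"  # hypothetical: held by a trusted credential issuer

def attest(claim: str) -> str:
    """Issuer signs a single claim, e.g. 'holder_is_over_21=true'."""
    return hmac.new(ISSUER_KEY, claim.encode(), hashlib.sha256).hexdigest()

def verify(claim: str, signature: str) -> bool:
    """Verifier checks just this claim; no other attributes are revealed."""
    return hmac.compare_digest(attest(claim), signature)

# The holder presents only the age claim, not address, height, etc.
sig = attest("holder_is_over_21=true")
print(verify("holder_is_over_21=true", sig))   # True
print(verify("holder_is_over_21=false", sig))  # False
```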

In a business transaction context, Catalini says, a blockchain could be used to build a reputation score for a party, who could then be verified as trustworthy or solvent without having to open its books for a full audit.

“Reputation scores both for businesses and individuals are today siloed into different platforms, and there is very little portability across platforms. Blockchain can improve on this,” he says.

Which industries could blockchain disrupt?

“All of them,” Catalini says. “The technology is what economists call a general purpose technology, and we will see many applications across different verticals.”

Here are a few to keep an eye on.

Central banks: Many central banks — including those in Canada, Singapore, and England — are studying and experimenting with blockchain technology and cryptocurrencies. The potential applications include lower settlement risk, more efficient taxation, faster cross-border payments, inter-bank payments, and novel approaches to quantitative easing. Imagine a central bank stimulating the economy by delivering digital currency automatically to citizens. Don’t expect big moves from big countries soon. The risk is too high, Catalini says. But expect to see smaller, developed countries with a high tolerance for technology experimentation lead the way and possibly experiment with a fiat-backed digital currency for some of their needs.

Finance: The busiest area of application so far, blockchain is being used by companies seeking to offer low-cost, secure, verifiable international payments and settlement. Ripple is one of the leaders in this space on the banking side. Meanwhile, companies like Digital Asset and Chain seek to create a faster, more efficient financial infrastructure for tracking and exchanging financial assets of any type.

Money transfer: In 2014, two MIT students raised and distributed $100 worth of bitcoin to every MIT undergraduate.

They wanted to see what would happen and generate interest on campus. Catalini, together with Professor Catherine Tucker, designed the experiment and studied the results. While 11 percent immediately cashed out their bitcoin, 49 percent were still holding on to some bitcoin. Some students used the funds to make purchases at local merchants, some of whom accepted bitcoin. Others traded with each other. Meanwhile, startups around the world competed to become the consumer trading application for bitcoin.

Then PayPal bought Venmo, a peer-to-peer mobile payment platform. PayPal’s own mobile app allows for peer-to-peer transactions as well. The bitcoin-based consumer payment industry cooled down. But the application of blockchain remains attractive because of the lower costs it could offer parties in global, peer-to-peer transactions. Rapid payment company Circle, which advertises itself as “Like a text filled with cash,” stopped allowing users to exchange bitcoin last year, but is building a protocol that allows digital wallets to exchange value using a blockchain.

Micropayments: What if, instead of subscribing to a news site online, you paid only for the articles you read?

As you click through the web, your browser would track the pages and record them for payment. Or what if you could get small payments for doing work — completing surveys, working as a freelance copy editor — for a variety of clients? By reducing the cost of the transaction and verifying the legitimacy of parties on either end, blockchain could make these micropayments, new types of cross-platform subscriptions, and forms of crowdsourcing possible and practical. A company called Brave is already attempting this, with potential ramifications for the digital advertising industry.
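A minimal sketch of the bookkeeping such a system might do (the names and the per-article price are hypothetical): charge a few cents per article read, then settle each publisher's total in one batched payment so that per-transaction fees do not swamp the micropayments themselves.

```python
from collections import defaultdict

PRICE_CENTS = 5  # hypothetical price per article, in cents

class MicropaymentLedger:
    """Accumulates tiny per-article charges for one batched settlement."""
    def __init__(self):
        self.owed_cents = defaultdict(int)

    def record_read(self, publisher: str):
        """Log one article read; no payment is sent yet."""
        self.owed_cents[publisher] += PRICE_CENTS

    def settle(self) -> dict:
        """Pay each publisher once for all accumulated reads."""
        batch = dict(self.owed_cents)
        self.owed_cents.clear()
        return batch

ledger = MicropaymentLedger()
for site in ["news.example", "news.example", "blog.example"]:
    ledger.record_read(site)
print(ledger.settle())  # {'news.example': 10, 'blog.example': 5}
```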

Identity and privacy: In October 2013, the arrest of the founder of Silk Road, a deep web marketplace where users paid for illegal goods with bitcoin, showed just how anonymous bitcoin really wasn’t. Nor was it ever intended to be — bitcoin addresses function much as a pseudonym does for a writer, Catalini says. Users can never completely mask their transactions. But others are trying. Zcash promises to be a fully private cryptocurrency.

There are significant downsides to the anonymity a blockchain could offer, such as the ability to fund terrorism or facilitate money laundering. But there are many virtuous applications too — Google’s DeepMind is attempting to use blockchain to layer privacy and security in electronic health care records.

Smart contracts: This application is still in the early stages, Catalini says, but by recording information on a blockchain, contracts could use that information to make themselves self-executing if certain conditions are met. This idea backfired last year when code was exploited to steal $60 million from The DAO, a blockchain-based venture capital firm.
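A minimal sketch of the self-executing idea (an illustration in plain Python, not Solidity or any actual smart-contract platform): payment is released automatically the moment a recorded condition becomes true, with no intermediary deciding.

```python
class EscrowContract:
    """Toy escrow: pays out automatically once delivery is confirmed."""
    def __init__(self, buyer: str, seller: str, amount: int):
        self.buyer, self.seller, self.amount = buyer, seller, amount
        self.delivered = False
        self.paid = False

    def confirm_delivery(self, delivered: bool):
        # In a real system this fact would come from a trusted oracle
        # or from data already recorded on the blockchain.
        self.delivered = delivered
        self._maybe_execute()

    def _maybe_execute(self):
        """Self-executing step: no intermediary decides to release funds."""
        if self.delivered and not self.paid:
            self.paid = True
            print(f"Released {self.amount} to {self.seller}")

contract = EscrowContract("alice", "bob", 100)
contract.confirm_delivery(True)  # Released 100 to bob
```

The DAO incident mentioned above shows the flip side of this design: once the condition logic is on the chain, a bug in it executes just as automatically as the intended behavior.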

Provenance and ownership: A blockchain could be used to record details about physical products, helping to verify authenticity and prevent fraud and counterfeiting. London-based EverLedger is tracking diamonds and envisions doing the same for fine wines. At the same time, for all these applications, a blockchain is only as useful as the quality of the information recorded on it in the first place.

Internet of things, robotics, and artificial intelligence: Your appliances are already talking to each other — think smart home technologies like Nest thermostats and security systems. What if they could barter or acquire resources? What if a highway could verify the identity of and accept payment from a self-driving car, opening up a pay-per-use fast lane to commuters in a rush? At the outer edge of application, but not outside the realm of possibility, Catalini says.

When will this disruption happen?

Over a period of more than ten years. Catalini is convinced blockchain has internet-level disruption potential, but like the internet it will come over a multi-decade timeline with fits and starts, and occasional setbacks. Some industries, especially finance, will see drastic change soon. Others will take longer.

“A lot of the work in this space is experimental,” Catalini says. “We are at the infrastructure building stage. Bitcoin has a market capitalization of $42 billion, which is nothing compared to the mainstream financial platforms and exchanges that move trillions of dollars every day.

“But the technology is maturing and growing. At some point, one of the startups in this space may reveal itself to be the Netscape of cryptocurrencies. What would follow is something we have seen play out many times before in history.”

View at the original source

Thursday, June 22, 2017

Presidential Elections 2017


PRESIDENTIAL ELECTION INDIA

We have two excellent candidates this time for the position of the President of India. Either of them will make a very good President.

We already know who is going to win.
...
Even promotion and campaigning are not needed, as the members of the electoral college will blindly vote as per their party's instructions anyway. I really doubt these voters know anything about the candidates' credibility and credentials other than their being Dalits.

Both political parties, the Congress and the BJP, are projecting their being Dalit as their only qualification, which is wrong.

One has been a great lawyer, a member of Parliament, and a governor.
The other has been a successful career diplomat, a member of Parliament, a minister, and Speaker of the Lok Sabha.

Their caste, which is just incidental, is being promoted and projected as their main virtue. When Pranab Mukherjee was selected as the presidential candidate, no one said he was a Brahmin.
Being a 'Chatur Brahmin', it seems, is a liability to be hidden.

By selecting a Dalit as a candidate, both political parties are giving the impression that they are giving alms to beggars (if not throwing crumbs).

Dalits are not beggars.....

If they are so worried about the Dalits,

Why have they not made a Dalit the Prime Minister?

Are there no Dalits among the elected MPs of both parties who are capable of handling this position?
The concern of both parties for Dalits is both opportunistic and self-serving, and it is for display only.

It is time the people of India created a third political dispensation that treats Dalits as normal human beings and creates a situation where Brahmins do not have to hide their caste and Dalits do not have to display theirs,

or a political thought process that doesn't promote caste labels.

CASTE PROFILING IS THE WORST KIND OF CORRUPTION ANY POLITICAL PARTY CAN INDULGE IN....

HOW CAN WE CALL THIS PARTY CORRUPTION-FREE...

Quite naturally, your views are needed to keep this discussion going.


Wednesday, June 21, 2017

China in quantum breakthrough as 'unhackable' experimental satellite sends first message. 06-22



  • Scientists in China used a 'quantum satellite' to send entangled photons 1,200 km
  • The satellite produces entangled photon pairs which form an encryption key
  • These photons will theoretically remain linked over great distances
  • This means that any attempts to listen in will be detected on the other side. 
In a major breakthrough for quantum teleportation, scientists in China have successfully transmitted entangled photons farther than ever before, achieving a distance of more than 1,200 km (745 miles) between suborbital space and Earth.

Entangled photons theoretically maintain their link across any distance and have the potential to revolutionize secure communications — but scientists had previously only managed to maintain the bond for about 100 km (62 miles).

Using the ‘quantum satellite’ Micius, the scientists were able to communicate with three ground stations in China, each more than 1,000 km (621 miles) apart. 



The 1,300-pound satellite is equipped with a laser, which the scientists subjected to a beam splitter.

This gave the beam two distinct polarized states.

One of these beams was then used to transmit entangled particles, and the other used to receive the photons. 

Pairs of entangled photons fired to ground stations can then form a ‘secret key.’
Theoretically, any attempts to breach this type of communication would be easily detectable. 
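A toy simulation of why listening in is detectable (a heavily simplified classical sketch, not real quantum mechanics; the 25% disturbance rate is an illustrative assumption): the two sides publicly compare a random sample of their key bits, and an interceptor's measurements show up as mismatches.

```python
import random

def simulate_key_exchange(n_bits=1000, eavesdropper=False, error_rate=0.25):
    """Entangled pairs give both sides the same random bit in this toy;
    an eavesdropper's measurement disturbs a fraction of them."""
    alice = [random.randint(0, 1) for _ in range(n_bits)]
    bob = list(alice)  # perfectly correlated, as entanglement predicts
    if eavesdropper:
        for i in range(n_bits):
            if random.random() < error_rate:
                bob[i] ^= 1  # disturbed photon flips Bob's outcome
    # Publicly compare a random sample; those bits are then discarded.
    sample = random.sample(range(n_bits), 100)
    return sum(alice[i] != bob[i] for i in sample)

print("quiet channel:", simulate_key_exchange())                    # 0 mismatches
print("intercepted:  ", simulate_key_exchange(eavesdropper=True))   # ~25 mismatches
```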

The satellite launched from the Jiuquan Satellite Launch Center last year, and the new findings mark a promising step forward in the two-year mission, which could be followed by a fleet of similar satellites if all goes well, according to Nature.

To overcome the complications of long-distance quantum entanglement, scientists often break the line of transmission up, creating smaller segments that can then repeatedly swap, purify, and store the information along the optical fiber, according to the American Association for the Advancement of Science.

The researchers sought to prove that particles can remain entangled across great distances – in this case, nearly 750 miles.

Earlier efforts to demonstrate quantum communication have shown this can be done up to just over 180 miles, and scientists hope that transmitting the photons through space will push this even farther.

When travelling through air and optical fibres, photons get scattered or absorbed, Nature explains, posing challenges to the preservation of the fragile quantum state.
But, photons can travel more smoothly through space.

Achieving quantum communication at such distances would enable the creation of secure worldwide communications networks, allowing two parties to communicate using a shared encryption key.

In quantum physics, entangled particles remain connected so that actions performed on one affect the behaviour of the other, even if they are separated by huge distances.

So, if someone were to attempt to listen in on one end, the disruption would be detectable on the other. 

Over the course of the two-year mission, the researchers in China will conduct a Bell test to prove the existence of entanglement at such a great distance.

And, they will attempt to ‘teleport’ quantum states, according to Nature, meaning the quantum state of the photon will be rebuilt in a new location.

Researchers from Canada, Japan, Italy, and Singapore have also revealed plans to conduct quantum experiments in space, including one proposed aboard the International Space Station.

This experiment would attempt to create a reliable and efficient means for teleportation.

By achieving quantum teleportation, the researchers say they could create a telescope with an enormous resolution.

‘You could not just see planets,’ said Paul Kwiat, a physicist at the University of Illinois at Urbana–Champaign involved with the NASA project, ‘but in principle read licence plates on Jupiter’s moons.’

This working heart tissue is made from spinach 06-21




Researchers from the Worcester Polytechnic Institute (WPI) have transformed a spinach leaf into functional heart tissue. The team’s goal was to recreate human organ tissue down to the fragile vascular networks of blood vessels it can’t survive without. Scientists had previously attempted to 3D print intricate vascular networks without success. This breakthrough could mean that the delicate vascular systems of plants are the key.


To create the heart tissue, the scientists at WPI revealed the leaf’s cellulose frame by stripping away the plant cells. Then, they “seeded” the frame with human cells, causing tissue growth on the frame. Finally, they were able to pump microbeads and fluids through the veins to illustrate the functioning concept.

Repairing Damage, Creating Replacements


Although other scientists have been able to create small-scale artificial samples of human tissue, those samples required integration with existing blood vessels. The large-scale creation of working tissue infused with the vascular vessels critical to tissue health had proven impossible.


Because the technique could help people grow layers of stronger, healthier heart muscle, the team suggests that it could eventually be used to treat heart attack patients or others whose hearts have difficulty contracting. The researchers have also experimented with parsley, peanut hairy roots, and sweet wormwood as they believe the technique could make use of different kinds of plants to repair other types of tissues. For example, wood cellulose frames could one day help us repair human bones.


“We have a lot more work to do, but so far this is very promising,” Glenn Gaudette, a professor of biomedical engineering at WPI, told The Telegraph. “Adapting abundant plants that farmers have been cultivating for thousands of years for use in tissue engineering could solve a host of problems limiting the field.”



How HIV-1 puts itself to sleep 06-21

Read about the antisense ASP RNA acting as viral latency factor.


Image credit: Shyam's Imagination Library

Upon infection of a new cell, the HIV-1 genome integrates into the genome of the host cell, and in this form HIV-1 is known as a provirus. Under proper cellular conditions, the HIV-1 provirus produces the transactivator Tat that drives efficient expression of the viral genome, leading to the production of new viral particles.

Alternatively, the provirus remains silent in a state known as latency. In our study, we demonstrated that HIV-1 encodes an antisense transcript (ASP) that recruits the cellular Polycomb Repressor Complex 2 (PRC2) to the proviral 5’LTR. PRC2 promotes nucleosome assembly at the 5’LTR, leading to transcriptional silencing and proviral latency.

While active regulation of proviral expression by Tat has long been known, latency was thought to be a passive event caused primarily by the absence of key cellular and viral transcription factors. Our study demonstrated that – on the contrary – HIV-1 also regulates the establishment and maintenance of latency through ASP, and therefore it controls all aspects of its destiny.

The impetus for this study arose – as is often the case – very serendipitously. Our lab became interested in the presence of antisense transcription in human retroviruses, HTLV-1 and HIV-1 – a research area that was relatively unexplored. There was some evidence in the literature that these antisense transcripts play a role in viral expression, but the mechanism was yet to be described. During an informal discussion, a friend and colleague – Dr. Rosa Bernardi – brought to our attention that many cellular antisense transcripts suppress the expression of their cognate sense transcript by tethering chromatin-modifying protein complexes to their promoter regions, and by inducing nucleosome formation and transcriptional silencing.

This inspired us to use RNA immunoprecipitation (RIP) assays to test whether the HIV-1 antisense RNA (ASP) interacts with members of the PRC2 complex. However, our initial efforts were repeatedly unsuccessful. Discouraged by these negative results, we decided to focus on other projects in the lab. After a few months we revisited these experiments, and we realized that there was a problem in the design of the RT-PCR portion of the RIP assay. After making the necessary modifications to the RT-PCR assay, we were finally able to demonstrate specific interaction between the ASP RNA and two components of the PRC2 complex. This important result encouraged us to further pursue this line of studies.

The “eureka” moment came shortly after that when functional studies showed that over-expression of the ASP RNA in vivo suppresses acute viral replication and promotes the establishment and maintenance of latency.

Our current efforts are focused on defining the structural and functional determinants of the ASP RNA. Since this transcript contains an open reading frame, we are also investigating the expression and function of the ASP protein.


Figure legend

The HIV-1 ASP RNA acts as a viral latency gene: it interacts with the cellular Polycomb Repressor Complex 2 (PRC2), and recruits it to the HIV-1 5’LTR. There, PRC2 catalyzes trimethylation (Me3) of lysine 27 (K27) on histone H3. The deposition of this repressing epigenetic mark leads to the assembly of the nucleosome Nuc-1, turning off transcription from the HIV-1 5’LTR, and promoting viral latency.

View at the original source

Wednesday, June 14, 2017

Mastering the Art of Communication: What Big Data Can Tell Us 06-15





Image credit: Shyam's Imagination Library

There’s plenty of anecdotal evidence about what makes a good communicator, but Noah Zandan is more interested in the science behind it. That’s why he co-founded Quantified Communications, a firm that helps business leaders remake and refine their messages.

Zandan spoke recently to Cade Massey, Wharton practice professor of operations, information and decisions and co-director of the Wharton People Analytics Initiative, about how he applies research to the art of communication. Massey is co-host of the Wharton Moneyball show on Wharton Business Radio on SiriusXM channel 111, and this interview was part of a special broadcast on SiriusXM for the Wharton People Analytics Conference.

An edited transcript of the conversation appears below.

Cade Massey: Let’s understand what Quantified Communications is and how you got going in that direction.

Noah Zandan: The idea behind it is that communications has always been considered an art. How people talk to each other, how executives communicate, how we relate to other people, how we connect to the world around us, has always kind of been this art. Academics have been studying it for years, which is really exciting, and what we are trying to do at Quantified Communications is bring some of that research and apply it to a business environment. We work with corporations and organizations to really help their leadership, help the people moving the message of the business to deliver that message, and do it in a way where they are using objective data to know whether or not it works.

Massey: What is your background?

Zandan: I studied economics in college. Econometrics. I showed up on Wall Street, bright-eyed, and realized pretty quickly as I got further and further into Wall Street that we were modeling everything — obviously looking at risk and trying to make $1 billion decisions — off of data. But there was a missing factor from our model, and that was the people: The way that the executives communicate, the way they told the story, how confident they were was really one of the critical success factors on Wall Street. But there was no data behind it, and I’m an econ guy. [I thought,] “This isn’t rational.” I started looking and found some amazing research. Folks like James Pennebaker at the University of Texas, people who have been measuring this stuff for years, but nobody in the business environment knew this existed.

“We thought visionaries would really be complex thinkers, but in fact what they’re really concerned with is making things simple and breaking it down into steps.”

And so from there, we started. Our co-founder [Peter Zandan] has a Ph.D. in evaluation research and started finding all of this great stuff and then built a big database and a big platform to measure it. All of the big presidential speeches, all of the TED talks, media interviews — you name it, we’ve tried to go find it.

Massey: What are you doing with it?

Zandan: Well first, you have to be able to process it. So you’ve got to tag it; you’ve got to organize it; you’ve got to make sure that it’s useful. The New York Times calls it being a data janitor. It is a huge part of the job for a data scientist. We spent a long time doing that, and then we had to go understand it. Was it successful or not? Did it accomplish its purpose? Did the audience react to it in the appropriate way? Go out and ask a bunch of people what they think. Do you trust this person if they did this? Do you believe them? Do you want to engage with them more? And then measure the factors of the communication. What types of words did they use? Were they making eye contact? What were they doing with their hands? Then you can understand the factors that correlate with success.

Massey: How did you decide what factors to look for?

Zandan: Again, academic research. Folks in academia have been doing this for years. One of the best guys out there is Albert Mehrabian out of UCLA. He created this model called the Three Vs — verbal, vocal, and visual. It breaks down someone’s communication into some of the important elements, and he did a bunch of research as to how those are correlated with whether or not I like you. You go talk to communications folks and researchers, and they understand eye contact, facial movement, features.

There are factors behind all this stuff.

Massey: As you said, it’s historically been an art. What is the disparity between what you’re bringing to this conversation versus what’s been in the conversation before? When you come to these academics with this unbelievable database and say, “I’ve run some tests of these ideas,” are they saying, “This is different than anything we’ve seen before?” Or is this just a bigger version of what they’ve done?

Zandan: I would probably say it’s just a different use case. The academics are doing it from a great research standard, really thinking about how to apply it for research validation. What we’re doing is trying to bring it in a more applied way — looking at how leaders can communicate, really thinking carefully about what their purposes and audience types are. And then we can also go a little bit further, in that we can build predictive models and just run them over and over, given that we’re a business and not held to kind of the research standards.

Massey: One question you’ve looked at is, what do visionary communicators or visionary leaders do? Can you give us a recap of your findings?

“If you think about Elon Musk talking about Tesla, he always talks about what it’s like to drive in the car, what it’s like to look at the car, how the doors work.”

Zandan: We looked at hundreds of transcripts of visionary leaders. It was just a linguistic analysis. We didn’t look at their faces or voices or things like that. What we identified was what separates these people who we consider to be visionaries, everybody from Amelia Earhart to FDR to Elon Musk to TED Talks on innovation. What separates them from the average communicator? What distinguishes them from a factor-model perspective?

There were three main findings that we had. One: We thought visionaries would talk a lot about the future, but in fact they talked about the present. Two: We thought visionaries would really be complex thinkers, but in fact what they’re really concerned with is making things simple and breaking it down into steps. Three: We thought that visionaries would be really concerned with their own vision, but in fact they’re more concerned with getting their vision into the minds of their audience.

Massey: What does that mean?

Zandan: That means using second-person pronouns and using a lot of perceptual language, talking about look, touch, and feel. It really brings the audience into the experience with you. So if you think about Elon Musk talking about Tesla, he always talks about what it’s like to drive in the car, what it’s like to look at the car, how the doors work. It’s really less about the future of energy and transport as some kind of theoretical vision; he really brings it down to earth and makes it tangible.
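Counting factors like these is mechanical once you have a transcript. A rough sketch in Python (the word lists are illustrative stand-ins, not Quantified Communications' actual model):

```python
import re

SECOND_PERSON = {"you", "your", "yours", "yourself"}
PERCEPTUAL = {"look", "looks", "see", "touch", "feel", "feels", "hear"}

def visionary_features(transcript: str) -> dict:
    """Count per-100-word rates of two factors from the study."""
    words = re.findall(r"[a-z']+", transcript.lower())
    rate = lambda hits: 100 * hits / max(len(words), 1)
    return {
        "second_person_per_100w": rate(sum(w in SECOND_PERSON for w in words)),
        "perceptual_per_100w": rate(sum(w in PERCEPTUAL for w in words)),
    }

sample = ("You can feel the acceleration the moment you touch the pedal. "
          "Look at the door handles as you walk up to the car.")
print(visionary_features(sample))
```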

Massey: One thing that jumps out to me about that research is the present tense versus the future, especially when you’re talking about visionary leaders. You would have expected that to go the other way. Why do you think they are so much more effective?

Zandan: We saw it highly correlated with credibility. I think that people think if you’re talking so much about the future, then it’s going to be less credible. People aren’t going to believe you as much. So, you really want to [apply it to] today.

“The data can lead you down a path of replication. We don’t want to do that, because so much of what you communicate is your personality.”

Massey: How do you apply this research for your clients?

Zandan: What we often get asked to do is help people improve their communications, use the technology, use the analytics, allow them to make data-driven decisions on how to better impact their audiences. The No. 1 question Oprah Winfrey gets when the lights go off after her interviews with all of these amazing world leaders and celebrities is, “How did I do?” That’s what these people want to know. We can answer that not in a way that their team is going to — which is, “Hey, boss, you did great.” We can actually give them a lot of truth in the data — talk about how they are perceived, talk about how they can get better, and give them a very prescriptive plan to better impact their audiences and achieve their purposes.

Massey: When you work with people in that role, what data do you collect?

Zandan: We look at text, audio or video, which we can take in. We’ll break those down into the elements. So text is what you say, the words. For audio, we’ll look at the words as well as your voice. And then for video, which is our favorite, you’ve got the face and the gestures. You break down all of those into different behavioral patterns, you measure all of them, you benchmark them against what they would consider to be a measure of success. That could be themselves, that could be someone who is best in class, that could be a competitor they aspire to be. And then you could give them a road map for how to achieve that. We’ll give them some guidance on that, but a lot of times they know. The White House came to us and said, “We want to replicate one of Obama’s best speeches. We know which one was our favorite, and we want to understand the different factors behind that.”

Massey: Can you speak about what you found?

Zandan: No. But the speech was a eulogy in Arizona, which they considered to be one of the best ones he has given during his tenure.

Massey: Let’s put it this way, did you find anything interesting when you looked at that kind of speech from that level? That’s really championship-level rhetoric.

Zandan: Of course. You uncover stuff, but what’s worth saying here is that there is also the other side of the equation, which is authenticity. I am not President Obama. I do not speak like President Obama. If I did, it would seem very strange to an audience. Everybody has their authentic tone. We work really hard to measure authenticity. It’s one of the hardest problems.

Massey: Being able to do something like that would be a real advance.

Zandan: It would. And there is obviously authenticity to the way you deliver the message, and there are words that are considered authentic. But what we’re careful on is we don’t want to push people to be something that they’re not. The data can lead you down a path of replication. We don’t want to do that, because so much of what you communicate is your personality.

View at the original source

Choose staff wisely when planning a digital transformation 06-13





Plenty of large businesses are, justifiably, embracing innovation of all kinds. But, cautions HPE's Craig Partridge, consider whether IT staff from old-school backgrounds (and their "think conservatively" cultural values) are the right people for a successful digital transition.

Every business wants to enhance what it does to make its products more valuable to customers (and thus more profitable to the company) and work more efficiently (that is, save money). So just about every enterprise organization is motivated to augment or create a digital strategy.

It’s one thing for a business to say, “Let’s exploit new technologies to gain competitive advantage.” Reaching that goal—or at least avoiding being left behind—takes a strategic plan, a dose of shiny new technology, and most important, attention to the human beings who create and implement the plan.

In a Hewlett Packard Enterprise Discover presentation, “Thriving in the Age of Digital Disruption,” HPE’s Craig Partridge, worldwide director of data center platforms consulting, shared real-life lessons of digital transformation based on customer use cases and successful projects. In the one-hour, high-speed session, Partridge detailed a blueprint highlighting the elements needed for success.

And regardless of the many technologies and business processes that may be involved, there’s one key lesson to take away from the exercise: Choose the right people for the job, and value your staff for their diverse abilities. Doing so creates tension, Partridge said. But that isn’t a bad thing.

Digital disruption is about data

Disruption might take the form of a car manufacturer that wants to build out a connected car. It may be a bank aiming to give customers a good mobile digital experience. Perhaps it’s a sports stadium that recognizes that attending a game now includes mobility and Wi-Fi, not just a hot dog. Or the Rio airport, which during the Olympics had to digitize its services to accommodate an extra 2 million passengers.

Most of these projects are powered by emerging technologies like the Internet of Things, cloud, machine learning, and data analytics.

Technologically speaking, the “edge” is about data: how you collect it, how you analyze it, and how you use it for competitive advantage. Each of us generates a huge amount of unstructured data, especially with our mobile devices. Nowadays, the "machine edge" (smart sensors and machine-to-machine communication) is adding even more data. “Going forward, I see people combining those two data sets to create a good experience,” Partridge said.

In the past, cloud computing discussions have focused on core-out issues: What should IT move out of the data center? Today, the conversation is about what data to bring in and how best to do so. That encourages a different viewpoint. “Hybrid IT is what powers that new experience at the edge,” Partridge said. And IT has to change the operating model to work in that new way.  

As organizations put together software-defined agendas to accelerate how and where they deliver services, the first step is recognizing that not every traditional business application needs to be changed or disrupted. Some big transactional systems don't need to be mobile. Other systems need to be bulldozed and replaced.

The drive to improve digital experiences is also forcing organizations to work with partners in the value chain (especially with API-based tools). It means adopting concepts like continuous integration and the DevOps agenda, cloud management tool sets, and open cloud stacks, all with quick feedback and quick iteration. This kind of thinking does not come naturally to many large IT shops.

Yet “new” often translates into “We haven’t figured this out yet.” (If it were otherwise, it wouldn’t be much of a disruption, right?) HPE has created blueprints for the business process to help organizations succeed—after all, you’d rather learn from others’ mistakes than your own, right?

Foster the people

“The No. 1 reason projects succeed or fail is people,” said Partridge, echoing sentiments long understood by developers and IT professionals, if not their managers. People processes, politics, and governance have a huge effect on project outcomes, even when you don’t think you are dealing with a so-called peopleware problem.

“Brokering the supply chain sounds like a technical issue,” Partridge noted. “What people miss is that it requires an organization shift.” A business’s CIO now has to place demand appropriately across the supply chain, which sometimes is in other parts of the organization.

Less obvious to many enterprise development teams are cultural issues. They spent years creating an organization based on repeatable processes and infrastructure, such as reliability, approval-based plans, and a waterfall development model that’s measured in months.

That predictability and resilience are strengths. “These are big deals to IT,” Partridge said. “We can’t lose that DNA. These systems of record need to maintain that integrity.”

But the new systems that are part of the digital disruption move a lot faster. Innovation-optimized projects emphasize flexibility, working on small teams that are business-centric and close to the customer, with short-term goals and a willingness to embrace uncertainty. “That technical documentation is six months old, so it’s out of date,” one DevOps consultant said to me during the conference, just in passing.

The development process for imagining disruption requires a different mind-set. Central IT pros can generally learn new tech, but learning new values and mind-sets can be much more challenging. “We can be retrained, but we have habits ingrained from years of work,” Partridge said.

For example, when the automobile manufacturer launched its digital transformation project, it initially staffed the team from its central IT department, whose “cadence didn’t lend itself to rapid iterative development,” Partridge said.

The company ended up starting over with a new IT group that operated in parallel with the existing central IT team. Although that might seem like a recipe for bickering and dysfunction, Partridge characterized the relationship as one of “creative tension,” because the friction led both teams to come up with ideas that helped one another. 

Digital transformation: Lessons for leaders

  • “New” often translates into “We haven’t figured this out yet.”
  • No matter how brilliant the idea is, success depends on putting the right personnel in place and supporting them properly. 
  • Value existing systems, and recognize what doesn’t benefit from changing. 

Tuesday, June 13, 2017

A Favorite Subject Returns to Schools: Recess. 06-14



After playtime was dropped amid focus on academic performance, educators now take playground breaks seriously




Kindergarten students take to the playground at Oak Point Elementary, in Oak Point, Texas, where recess went from 30 minutes a day to one hour a day. Photo: Brandon Thibodeaux for The Wall Street Journal

Three kindergarten girls looked close to taking a spill as they sat on the high back of a bench on a playground at Oak Point Elementary. Feet away, several administrators looked on, not making a move to stop them, because at this school outside of Dallas, playtime is revered.

“As long as they’re safe, we allow kids to be kids,” said Daniel Gallagher, assistant superintendent for educational services in the Little Elm Independent School District.

That’s the mantra in this small school district, where schoolchildren are transitioning from one daily 30-minute recess to one hour a day, taken in four 15-minute increments. School officials say children are better focused with more unstructured breaks and do better in school.

School districts throughout the country are reassessing recess—with some bringing back the pastime or expanding it, citing academic and health benefits.

On Tuesday, the Minneapolis school board is expected to consider moving from a recommended 20 minutes of daily recess to a required 30 minutes daily. And in Florida, parents are hoping the governor will soon sign an education bill that includes a required 20 minutes of daily recess for elementary-school students in traditional public schools.

In the past year, the state of Rhode Island and school districts in Dallas, Portland, the Jefferson Parish Public School System in Louisiana, and the Orange County and Manatee County school districts in Florida are among those to implement a daily-recess requirement.

About 21% of school districts required recess daily for elementary-school students in the 2013-2014 school year, according to the latest study from the Centers for Disease Control and Prevention and Bridging the Gap Research Program. That’s an increase from 16% of school districts with the requirement in 2006-2007.

It’s a change after years of recess taking a back seat to testable core subjects like math and reading, with a noticeable decline in playtime after the rollout of the now-defunct 2002 No Child Left Behind education law that put more focus on holding schools accountable for academic performance.

The Center on Education Policy, a national research group, found in a 2007 report that 58% of school districts increased time spent teaching English language arts, while 45% increased math time, after the 2002 education law. Meanwhile, 20% of school districts decreased the amount of time spent on recess, by an average of 50 minutes a week. (The CDC recommends at least 20 minutes of daily recess for elementary-school students.)

Supporters of daily recess often point to a 2013 study by the American Academy of Pediatrics, which says in part that “recess serves as a necessary break from the rigors of concentrated, academic challenges in the classroom.” The study also found that “safe and well-supervised recess offers cognitive, social, emotional, and physical benefits that may not be fully appreciated when a decision is made to diminish it.”

“Recess resets their brain,” said Lowell Strike, superintendent in the Little Elm district, where children have recess as long as the wind chill is at least 13 degrees and the heat index is no higher than 103.

But there has been some pushback. Some school administrators and lawmakers have spoken against state bills to mandate recess, saying it takes away flexibility from schools. This year, the Arizona School Boards Association opposed a bill in the state that would have required 50 minutes of daily recess in elementary schools.

“We are absolutely not against school recess,” said Chris Kotterman, the association’s director of governmental relations. “But when it comes to how the school day should be structured, it should be left up to the local school board. We generally try to keep state policy mandates to a minimum.”

Parents in areas around the country are advocating for daily recess.

Angela Browning is among “recess moms” in Florida pushing for a statewide recess mandate. She said the group has successfully pushed for daily recess in a few Florida school districts, including Orange County Public Schools, where her three children attend school. Ms. Browning said she got active several years ago upon finding out from her children that their school didn’t offer daily recess.

“I was stunned,” she said. “Children learn on the playground — leadership skills, social skills, negotiating skills. With all the testing, recess got squeezed out along the way.”

Orange County Public Schools started requiring 20 minutes of recess daily for students in kindergarten through fifth grade in the 2016-17 school year.

Recess requirements are usually decided at the campus level, and to a lesser extent at the district level. Studies have found that a majority of schools offer some type of recess, but not always regularly nor with set timespans—and sometimes in conjunction with school lunch. Those who linger over lunch get less playtime.

The CDC advises against taking recess in conjunction with lunch breaks and physical-education classes, saying that it should be unstructured and on a regular schedule.

In Little Elm, teacher Nicole Beal said she has seen firsthand the benefits of her kindergarten students having recess breaks during the school day.

“Their reading is better, they’re more focused,” she said. “Getting outside, it’s a nice break.”

When the children were asked who likes the extra playtime, every hand shot up.

View at the original source

Brain Architecture: Scientists Discover 11 Dimensional Structures That Could Help Us Understand How the Brain Works 06-14




Scientists studying the brain have discovered that the organ operates on up to 11 different dimensions, creating multiverse-like structures that are “a world we had never imagined.”

By using an advanced mathematical system, researchers were able to uncover architectural structures that appear when the brain has to process information, before they disintegrate into nothing.

Their findings, published in the journal Frontiers in Computational Neuroscience, reveal the hugely complicated processes involved in the creation of neural structures, potentially helping explain why the brain is so difficult to understand and tying together its structure with its function.

The team, led by scientists at the EPFL, Switzerland, were carrying out research as part of the Blue Brain Project—an initiative to create a biologically detailed reconstruction of the human brain. Working initially on rodent brains, the team used supercomputer simulations to study the complex interactions within different regions.

In the latest study, researchers homed in on the neural network structures within the brain using algebraic topology—a system used to describe networks with constantly changing spaces and structures. This is the first time this branch of math has been applied to neuroscience.

"Algebraic topology is like a telescope and microscope at the same time. It can zoom into networks to find hidden structures—the trees in the forest—and see the empty spaces—the clearings—all at the same time," study author Kathryn Hess said in a statement.

In the study, researchers carried out multiple tests on virtual brain tissue to find brain structures that would never appear just by chance. They then carried out the same experiments on real brain tissue to confirm their virtual findings.

They discovered that when they presented the virtual tissue with stimulus, groups of neurons form a clique. Each neuron connects to every other neuron in a very specific way to produce a precise geometric object. The more neurons in a clique, the higher the dimensions.

In some cases, researchers discovered cliques with up to 11 different dimensions.
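In this framework, a clique of n all-to-all connected neurons is treated as a simplex of dimension n-1, so an 11-dimensional object corresponds to a clique of 12 neurons. A small sketch with the networkx library (an undirected illustration of the counting idea; the Blue Brain work itself uses directed cliques):

```python
import networkx as nx

def clique_dimensions(graph: nx.Graph) -> dict:
    """Count maximal cliques by dimension (a clique of n nodes -> dim n-1)."""
    dims = {}
    for clique in nx.find_cliques(graph):  # enumerates maximal cliques
        dim = len(clique) - 1
        dims[dim] = dims.get(dim, 0) + 1
    return dims

# A small random graph stands in for a (tiny) neural connectivity matrix.
g = nx.gnp_random_graph(60, 0.25, seed=1)
print(clique_dimensions(g))  # e.g. counts of 2D, 3D, 4D... cliques
```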

The structures assembled formed enclosures for high-dimensional holes that the team has dubbed cavities. Once the brain has processed the information, the clique and cavity disappear.

Figure legend: The left shows a digital copy of a part of the neocortex, the most evolved part of the brain. On the right is a representation of the structures with different dimensions. The black hole in the middle symbolizes a complex of multi-dimensional spaces, or cavities.

"The appearance of high-dimensional cavities when the brain is processing information means that the neurons in the network react to stimuli in an extremely organized manner," said one of the researchers, Ran Levi.

"It is as if the brain reacts to a stimulus by building then razing a tower of multi-dimensional blocks, starting with rods (1D), then planks (2D), then cubes (3D), and then more complex geometries with 4D, 5D, etc. The progression of activity through the brain resembles a multi-dimensional sandcastle that materializes out of the sand and then disintegrates," he said.

Henry Markram, director of Blue Brain Project, said the findings could help explain why the brain is so hard to understand. "The mathematics usually applied to study networks cannot detect the high-dimensional structures and spaces that we now see clearly,” he said.

"We found a world that we had never imagined. There are tens of millions of these objects even in a small speck of the brain, up through seven dimensions. In some networks, we even found structures with up to eleven dimensions."

The findings indicate the brain processes stimuli by creating these complex cliques and cavities, so the next step will be to find out whether or not our ability to perform complicated tasks requires the creation of these multi-dimensional structures.

In an email interview with Newsweek, Hess says the discovery brings us closer to understanding “one of the fundamental mysteries of neuroscience: the link between the structure of the brain and how it processes information.”

By using algebraic topology, she says, the team was able to discover “the highly organized structure hidden in the seemingly chaotic firing patterns of neurons, a structure which was invisible until we looked through this particular mathematical filter.”

Hess says the findings suggest that when we examine brain activity with low-dimensional representations, we only get a shadow of the real activity taking place. This means we can see some information, but not the full picture. “So, in a sense our discoveries may explain why it has been so hard to understand the relation between brain structure and function,” she explains.

“The stereotypical response pattern that we discovered indicates that the circuit always responds to stimuli by constructing a sequence of geometrical representations starting in low dimensions and adding progressively higher dimensions, until the build-up suddenly stops and then collapses: a mathematical signature for reactions to stimuli.

“In future work we intend to study the role of plasticity—the strengthening and weakening of connections in response to stimuli—with the tools of algebraic topology. Plasticity is fundamental to the mysterious process of learning, and we hope that we will be able to provide new insight into this phenomenon,” she added.



The waning days of Indian IT workers being paid to do nothing 06-13





The bench, the Indian IT industry’s resource bank, is thinning.

Long considered a key strength of India’s tech majors, the bench is losing its relevance even as just-in-time contract hiring gains popularity. More companies are hiring techies on relatively short, fixed-term contracts, rather than employing them full-time even when there are no projects.

Automation, creeping unionism, and a global closing of borders for techies have in recent times accelerated this process. So much so that the average IT company’s bench strength has progressively fallen from between 8% and 10% of billable employees to between 4% and 5% now, human resources (HR) experts believe.

So what exactly is happening?

What is the bench?

In the IT industry, the bench refers to the section of a company’s employees that isn’t working on any project for the time being but remains on the rolls and receives regular salary.

“The best way to answer this question is with an analogy…In football or cricket there are only 11 players allowed on the pitch/ground. So there are 5/6 players out as subs ready to come on in case of injuries. These players usually sit (or at least used to) on a bench and hence the expression ‘Sitting on the bench,'” Bhavish Parkala, a developer with General Electric, posted on Quora.

It is, indeed, a bank of personnel. (Interestingly, the term “bank” itself comes from the Italian banchieri, the foreign-exchange dealers of 14th-century Italy. They were called so because they “did their business literally seated on ‘benches’ behind tables in the street,” Niall Ferguson writes in his book The Ascent of Money.)

This bank could consist of fresh graduates or senior techies. A person could spend anywhere from a couple of weeks to six months on the bench.

“At Infosys, we have a small percentage of employees on bench at all times…This is a planned period where the employee gets time to learn as well as focus on some internal initiatives,” Richard Lobo, EVP & head of HR at Infosys, told Quartz in an email.

This is largely an Indian phenomenon. After all, Indians tend to prefer secure full-time jobs over contract positions. In other places, people don’t hesitate to take up short-term projects, says Alka Dhingra, assistant general manager at staffing firm TeamLease Services. So, companies don’t have to hire full-time employees and then bench them when there are no projects.

The bench, like a hologram, looks different from every angle.

For IT firms, it is often an important factor their clients consider. A strong bench is an indication that the firm has ready resources and can begin execution immediately. But having too many people on the bench doesn’t reflect well either. It would mean employees are underutilised, and this would impact the profitability of the firm. Firms rarely speak about their bench size and are always working towards high utilisation rates.

As for employees, some say it is fun initially—you get paid to do nothing and have the opportunity to learn new skills, prepare for projects or for competitive exams. But fatigue sets in soon enough. “I spent around 16 months (on the) bench. Initially it was like getting paid to enjoy the life…It takes some time to figure out the company is not responsible for your growth,” Indira Raghavan, a techie, wrote on Quora last year. In recent times, with layoff fears lingering, not having a project to work on could be reason enough to lose your job.

Meanwhile, companies have been actively working at improving utilisation rates. “Earlier, 30% of employees were on the bench and the utilisation ratio was about 70%. This has now gone up to 80-81%,” Kris Lakshmikanth, the managing director at recruitment firm Head Hunters India, said.

Vishal Sikka, CEO & MD of Infosys, India’s second-largest IT services company, said last year that he was pushing automation to reduce bench strength. “Despite being here (at Infosys) for 18 months, I still can’t find an answer around the idea of a bench,” Sikka told the Business Standard newspaper (paywall).

Zero Bench, an initiative Infosys launched in 2015, helps employees find short-term assignments. Under it, employees who have tasks to perform can post requirements based on their projects, and benched employees can sign up to help finish them. According to Infosys’s annual reports, the company’s utilisation rate (the share of the workforce actively working on projects rather than sitting on the bench) rose from 76.6% in the financial year 2012 (pdf) to 81.7% in the financial year 2017 (pdf).

Infosys isn’t alone. “All these (IT) companies have a resource management team which links bench people and internal teams to different projects and mobilises the bench people to different projects,” Dhingra of TeamLease said.

The beginning of the end

Now, IT companies are increasingly seeking “just-in-time” employees on contract, and industry experts see this as a trend that will replace the bench.

The idea behind having a bench was to ensure that employees are available to start working on projects as soon as the customer assigns a task to the IT firm. Instead, firms are now seeking techies who can come on board at short notice for specific projects, after which they either move on to other jobs, join the same company’s next project or, at times, get absorbed into the company as full-time employees.

HR experts believe contract employees are a better alternative to the bench. They are just as effective in terms of deployment, they help cut costs, they let the company pick professionals with better skills, and they help companies avoid mass layoffs and the protests that follow.

In their latest annual reports, IT firms Wipro and Cognizant say that “profitability could suffer” if “favorable utilisation rates” are not maintained.

“All IT companies are under cost pressures now,” Lakshmikanth says. So, hiring techies only when they’re needed makes sense. Most companies pay contract workers more than they would full-time employees—about 20% more, according to Lakshmikanth—but this is still economical compared to maintaining a large bench.

“Contingent hiring would be the way forward across the industry…Typically, contract employees will be the virtual just-in-time bench,” said Thammaiah BN, managing director of recruitment services company Kelly Services India.

While Indian techies still prefer full-time jobs, acceptance of contract-based employment is rising. It is seen as an opportunity to get one’s foot in the door and then get absorbed into a company. Then there’s the advantage of getting good brand names on one’s resume, learning new technologies, and gaining knowledge and experience, Dhingra pointed out.

Besides, it is good money. IT firms pay contract employees more than regular employees. So, for someone who is benched, between jobs, or just out of college, contract work is a good deal. Of course, there are drawbacks, too. “Recession? Layoff? Contract employees are the worst hit,” says Shashank CG, a software engineer.

View at the original source

Artificial intelligence: here’s what you need to know to understand how machines learn 06-13


From Jeopardy winners and Go masters to infamous advertising-related racial profiling, it would seem we have entered an era in which artificial intelligence developments are rapidly accelerating. But a fully sentient being whose electronic “brain” can fully engage in complex cognitive tasks using fair moral judgement remains, for now, beyond our capabilities.

Unfortunately, current developments are generating a general fear of what artificial intelligence could become in the future. Its representation in recent pop culture shows how cautious – and pessimistic – we are about the technology. The problem with fear is that it can be crippling and, at times, promote ignorance.

Learning the inner workings of artificial intelligence is an antidote to these worries. And this knowledge can facilitate both responsible and carefree engagement.

The core foundation of artificial intelligence is machine learning, an elegant and widely accessible tool. But to understand what machine learning means, we first need to examine how it works, and why its potential benefits outweigh its risks.

Data are the key

Simply put, machine learning refers to teaching computers how to analyse data and solve particular tasks through algorithms. For handwriting recognition, for example, classification algorithms are used to differentiate letters based on someone’s handwriting. With housing data sets, on the other hand, regression algorithms are used to put a quantifiable estimate on the selling price of a given property.
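To make the two families concrete, here is a minimal sketch in Python using scikit-learn. The built-in digits dataset stands in for handwriting recognition, and the housing numbers are invented for illustration; neither comes from the article itself.

    from sklearn.datasets import load_digits
    from sklearn.linear_model import LinearRegression
    from sklearn.model_selection import train_test_split
    from sklearn.neighbors import KNeighborsClassifier

    # Classification: sort images of handwritten digits into ten categories.
    digits = load_digits()
    X_train, X_test, y_train, y_test = train_test_split(
        digits.data, digits.target, random_state=0)
    classifier = KNeighborsClassifier().fit(X_train, y_train)
    print("digit accuracy:", classifier.score(X_test, y_test))

    # Regression: put a number on a property's selling price.
    # Features per house: [floor area in m^2, number of rooms] (toy data).
    houses = [[50, 2], [80, 3], [120, 4], [65, 2], [95, 3]]
    prices = [150_000, 230_000, 340_000, 180_000, 270_000]
    regressor = LinearRegression().fit(houses, prices)
    print("predicted price for 100 m^2, 3 rooms:", regressor.predict([[100, 3]])[0])

The same fit-then-predict pattern serves both tasks; only the kind of answer changes, a category for classification and a number for regression.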



Machine learning, then, comes down to data. Almost every enterprise generates data in one way or another: think market research, social media, school surveys, automated systems. Machine learning applications try to find hidden patterns and correlations in the chaos of large data sets to develop models that can predict behaviour.

Data have two key elements – samples and features. The former represents individual elements in a group; the latter amounts to characteristics shared by them.

Look at social media as an example: users are samples and their usage can be translated as features. Facebook, for instance, employs different aspects of “liking” activity, which change from user to user, as important features for user-targeted advertising.

Facebook friends can also be used as samples, while their connections to other people act as features, establishing a network where information propagation can be studied.



Outside of social media, automated systems used as monitoring tools in industrial processes take time snapshots of the entire process as samples, and sensor measurements at a particular time as features. This allows the system to detect anomalies in the process in real time.
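As a sketch of how such a monitoring tool can be organised, in the snippet below rows are samples (time snapshots) and columns are features (sensor readings). The sensor values and the simple z-score threshold are illustrative assumptions, not the method of any particular real system.

    import numpy as np

    rng = np.random.default_rng(seed=0)

    # 1,000 samples (time snapshots of the process), 3 features (sensors).
    readings = rng.normal(loc=100.0, scale=2.0, size=(1000, 3))
    readings[500, 1] = 130.0  # inject a fault into sensor 1 at snapshot 500

    # Flag snapshots where any sensor strays more than 4 standard
    # deviations from that sensor's mean.
    z_scores = np.abs((readings - readings.mean(axis=0)) / readings.std(axis=0))
    anomalies = np.where((z_scores > 4).any(axis=1))[0]
    print("anomalous snapshots:", anomalies)  # typically just [500]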

All these different solutions rely on feeding data to machines and teaching them to reach their own predictions once they have strategically assessed the given information. And this is machine learning.

Human intelligence as a starting point

Any data can be translated into these simple concepts and any machine-learning application, including artificial intelligence, uses these concepts as its building blocks.

Once data are understood, it’s time to decide what to do with this information. One of the most common and intuitive applications of machine learning is classification. The system learns how to put data into different groups based on a reference data set.

This is directly associated with the kinds of decisions we make every day, whether it’s grouping similar products (kitchen goods versus beauty products, for instance) or choosing good films to watch based on previous experiences. While these two examples might seem completely disconnected, they rely on an essential assumption of classification: that predictions fall into well-established categories.

When picking up a bottle of moisturiser, for example, we use a particular list of features (the shape of the container, for instance, or the smell of the product) to predict – accurately – that it’s a beauty product. A similar strategy is used for picking films by assessing a list of features (the director, for instance, or the actor) to predict whether a film is in one of two categories: good or bad.

By grasping the different relationships between features associated with a group of samples, we can predict whether a film may be worth watching or, better yet, we can create a program to do this for us.
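Here is what such a program might look like in miniature: a decision tree trained on a handful of invented films, each described by a few features and labelled good or bad from past experience. The films, the feature encoding, and the labels are all made up for illustration.

    from sklearn.tree import DecisionTreeClassifier

    # Features per film: [director_id, lead_actor_id, runtime_minutes].
    # Encoding people as small integers is a simplification for this sketch.
    past_films = [[0, 0, 120], [0, 1, 95], [1, 0, 150], [1, 1, 88], [2, 2, 110]]
    liked = [1, 1, 0, 0, 1]  # 1 = good, 0 = bad, from past experience

    model = DecisionTreeClassifier(random_state=0).fit(past_films, liked)

    new_film = [[0, 2, 105]]  # same director as two films we liked
    print("worth watching?", bool(model.predict(new_film)[0]))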

But to be able to manipulate this information, we need to be a data science expert, a master of maths and statistics, with enough programming skills to make Alan Turing and Margaret Hamilton proud, right? Not quite.

We all know enough of our native language to get by in our daily lives, even if only a few of us can venture into linguistics and literature. Maths is similar; it’s around us all the time, so calculating change from buying something or measuring ingredients to follow a recipe is not a burden. In the same way, machine-learning mastery is not a requirement for its conscious and effective use.
Yes, there are extremely well-qualified and expert data scientists out there but, with little effort, anyone can learn its basics and improve the way they see and take advantage of information.

Algorithm your way through it

Going back to our classification algorithm, let’s think of one that mimics the way we make decisions. We are social beings, so how about social interactions? First impressions are important, and we all have an internal model that evaluates, within the first few minutes of meeting someone, whether we like them or not.

Two outcomes are possible: a good or a bad impression. For every person, different characteristics (features) are taken into account, even if unconsciously, based on several encounters in the past (samples). These could be anything from tone of voice to extroversion, overall attitude, and politeness.

For every new person we encounter, a model in our heads registers these inputs and establishes a prediction. We can break this modelling down to a set of inputs, weighted by their relevance to the final outcome (see the sketch after this passage).

For some people, attractiveness might be very important, whereas for others a good sense of humour or being a dog person says far more. Each person develops her own model, which depends entirely on her experiences, or her data.

Different data result in different models being trained, with different outcomes. Our brain develops mechanisms that, while not entirely clear to us, establish how these factors weigh against each other.

What machine learning does is develop rigorous, mathematical ways for machines to calculate those outcomes, particularly in cases where we cannot easily handle the volume of data. Now more than ever, data are vast and everlasting. Having access to a tool that actively uses this data for practical problem solving, such as artificial intelligence, means everyone should and can explore and exploit it. We should do this not only to create useful applications, but also to put machine learning and artificial intelligence in a brighter, less worrisome perspective.
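As promised, here is a toy version of that first-impression model: a weighted sum of inputs squashed into a probability by a logistic function. The features, weights, and bias are invented; in actual machine learning the weights would be fitted from data rather than set by hand.

    import math

    def impression(features, weights, bias):
        """Probability of a good impression from a weighted sum of inputs."""
        score = bias + sum(w * x for w, x in zip(weights, features))
        return 1.0 / (1.0 + math.exp(-score))  # squash the score into 0..1

    # Inputs on a 0-1 scale: [tone_of_voice, extroversion, politeness].
    my_weights = [0.5, 0.2, 2.0]  # this particular model values politeness most
    stranger = [0.8, 0.3, 0.9]

    p = impression(stranger, my_weights, bias=-1.5)
    print("chance of a good impression: %.0f%%" % (p * 100))  # about 68%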

There are several machine-learning resources out there, although they do require some programming ability. Many are tailored to popular programming languages, ranging from basic tutorials to full courses. It takes nothing more than an afternoon to start venturing into the field with palpable results.

All this is not to say that the concept of machines with human-like minds should not concern us. But knowing more about how these minds might work will give us the power to be agents of positive change, in a way that allows us to maintain control over artificial intelligence, and not the other way around.

Monday, June 12, 2017

Figuring out superconductors 06-13


Physicists create antiferromagnet that may help develop, monitor key materials.



From the moment when physicists discovered superconductors — materials that conduct electricity without resistance at extremely low temperatures — they wondered whether they might be able to develop materials that exhibit the same properties at warmer temperatures.

The key to doing so, a group of Harvard scientists say, may lie in another exotic material known as an antiferromagnet.

Led by physics professor Markus Greiner, a team of physicists has taken a crucial step toward understanding those materials by creating a quantum antiferromagnet from an ultracold gas of hundreds of lithium atoms. The work is described in a May 25 paper published in the journal Nature.

“We have created a model system for real materials … and now, for the first time, we can study this model system in a regime where classical computers get to their limit,” Greiner said. “Now, we can poke and prod our antiferromagnet. It’s a beautifully tunable system, and we can even freeze time to take a snapshot of where the atoms are. That’s something you won’t be able to do with an actual solid.”

But what, exactly, is an antiferromagnet?

Traditional magnets, the kind that you can stick to your refrigerator, work because the electron spins in the material are aligned, allowing them to work in unison. In an antiferromagnet, however, those spins are arranged in a checkerboard pattern. One spin may be pointed north, while the next is pointing south, and so on.
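As a toy illustration of that checkerboard arrangement (emphatically not a model of the ultracold-atom experiment itself), a small lattice of up and down spins can be built and checked in a few lines of Python:

    import numpy as np

    n = 6
    rows, cols = np.indices((n, n))
    spins = np.where((rows + cols) % 2 == 0, 1, -1)  # +1 = north, -1 = south
    print(spins)

    # In a perfect antiferromagnet every nearest-neighbour pair is
    # anti-aligned, so the product of adjacent spins is always -1.
    assert np.all(spins[:, 1:] * spins[:, :-1] == -1)  # horizontal neighbours
    assert np.all(spins[1:, :] * spins[:-1, :] == -1)  # vertical neighbours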

Understanding antiferromagnets is important, Greiner and physics professor Eugene Demler said, because experimental work has suggested that, in the most promising high-temperature superconductors — a class of copper-containing compounds known as cuprates — the unusual state may be a precursor to high-temperature superconductivity.

Currently, Demler said, the best cuprates display superconductivity at about minus 160 degrees Fahrenheit, which is cold by everyday standards, but far higher than for any other type of superconductor. That temperature is also warm enough to allow practical applications of cuprate superconductors in telecommunications, transportation, and in the generation and transmission of electric power.

“This antiferromagnet stage is a crucial stepping-stone for understanding superconductors,” said Demler, who led the team providing theoretical support for the experiments. “Understanding the physics of these doped antiferromagnets may be the key to high-temperature superconductivity.”

To build one, Greiner and his team trapped a cloud of lithium atoms in a vacuum and then used a technique they dubbed “entropy redistribution” to cool them to just 10 billionths of a degree above absolute zero, which allowed them to observe the unusual physics of antiferromagnets.

“We have full control over every atom in our experiment,” said Daniel Greif, a postdoctoral fellow working in Greiner’s lab. “We use this control to implement a new cooling scheme, which allows us to reach the lowest temperatures so far in such systems.”

View at the original source