Helping make a ubiquitous model of decision processes more accurate

Markov decision processes are mathematical models used to determine the best courses of action when both current circumstances and future consequences are uncertain. They’ve had a huge range of applications — in natural-resource management, manufacturing, operations management, robot control, finance, epidemiology, scientific-experiment design, and tennis strategy, just to name a few.

But analyses involving Markov decision processes (MDPs) usually make some simplifying assumptions. In an MDP, a given decision doesn’t always yield a predictable result; it could yield a range of possible results. And each of those results has a different “value,” meaning the chance that it will lead, ultimately, to a desirable outcome.

Characterizing the value of a given decision requires the collection of empirical data, which can be prohibitively time-consuming, so analysts usually just make educated guesses. That means, however, that the MDP analysis doesn’t guarantee the best decision in all cases.

In the Proceedings of the Conference on Neural Information Processing Systems, published last month, researchers from MIT and Duke University took a step toward putting MDP analysis on more secure footing. They show that, by adopting a simple trick long known in statistics but little applied in machine learning, it’s possible to accurately characterize the value of a given decision while collecting much less empirical data than had previously seemed necessary.

In their paper, the researchers described a simple example in which the standard approach to characterizing probabilities would require the same decision to be performed almost 4 million times in order to yield a reliable value estimate.

With the researchers’ approach, it would need to be run 167,000 times. That’s still a big number — except, perhaps, in the context of a server farm processing millions of web clicks per second, where MDP analysis could help allocate computational resources. In other contexts, the work at least represents a big step in the right direction.

“People are not going to start using something that is so sample-intensive right now,” says Jason Pazis, a postdoc at the MIT Laboratory for Information and Decision Systems and first author on the new paper. “We’ve shown one way to bring the sample complexity down. And hopefully, it’s orthogonal to many other ways, so we can combine them.”

Unpredictable outcomes

In their paper, the researchers also report running simulations of a robot exploring its environment, in which their approach yielded consistently better results than the existing approach, even with more reasonable sample sizes — nine and 105. Pazis emphasizes, however, that the paper’s theoretical results bear only on the number of samples required to estimate values; they don’t prove anything about the relative performance of different algorithms at low sample sizes.

Pazis is joined on the paper by Jonathan How, the Richard Cockburn Maclaurin Professor of Aeronautics and Astronautics at MIT, and by Ronald Parr, a professor of computer science at Duke.

Although the possible outcomes of a decision may be described according to a probability distribution, the expected value of the decision is just the mean, or average, value of all outcomes. In the familiar bell curve of the so-called normal distribution, the mean defines the highest point of the bell.
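The article leaves the statistical trick unnamed, but one device long known in statistics for robust mean estimation is the median of means: split the samples into groups, average each group, and take the median of the group averages. The sketch below illustrates that idea only; it is not the authors' algorithm, and the outcome distribution and group count are invented for the demonstration.

```python
import random
import statistics

def median_of_means(samples, num_groups=10):
    """Robust mean estimate: average within groups, then take the median.

    A few extreme samples can corrupt only a few group averages, and the
    median ignores those, so the estimate stabilizes with far less data.
    """
    size = len(samples) // num_groups
    group_means = [
        statistics.mean(samples[i * size:(i + 1) * size])
        for i in range(num_groups)
    ]
    return statistics.median(group_means)

# Simulated outcome values of one decision: usually near 1.0, but a few
# rare outcomes are extreme, which wrecks the naive sample mean.
random.seed(0)
outcomes = [random.gauss(1.0, 0.1) for _ in range(997)] + [1000.0] * 3
random.shuffle(outcomes)

print(statistics.mean(outcomes))   # naive mean, dragged far from 1.0
print(median_of_means(outcomes))   # stays close to the typical value 1.0
```

With at most three contaminated groups out of ten, the median of the group averages is always taken over clean groups, which is why the estimate survives the outliers.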

The hottest subjects on campus

On an afternoon in early April, Tommi Jaakkola is pacing at the front of the vast auditorium that is 26-100. The chalkboards behind him are covered with equations. Jaakkola looks relaxed in a short-sleeved black shirt and jeans, and gestures to the board. “What is the answer here?” he asks the 500 MIT students before him. “If you answer, you get a chocolate. If nobody answers, I get one — because I knew the answer and you didn’t.” The room erupts in laughter.

With similar flair but a tighter focus on the first few rows of seats, Regina Barzilay had held the room the week prior. She paused often to ask: “Does this make sense?” If silence ensued, she warmly met the eyes of the students and reassured them: “It’s okay. It will come.” Barzilay acts as though she is teaching a small seminar rather than a stadium-sized class requiring four instructors, 15 teaching assistants, and, on occasion, an overflow room.

Welcome to “Introduction to Machine Learning,” a course in understanding how to give computers the ability to learn things without being explicitly programmed to do so. The popularity of 6.036, as it is also known, grew steadily after it was first offered, from 138 students in 2013 to 302 in 2016. This year 700 students registered for the course — so many that professors had to find ways to winnow the class down to about 500, a size that could fit in one of MIT’s largest lecture halls.

Jaakkola, the Thomas Siebel Professor in the Department of Electrical Engineering and Computer Science and the Institute for Data, Systems, and Society, and Barzilay, the Delta Electronics Professor of Electrical Engineering and Computer Science, have led 6.036 since its inception. They provide students from varied departments with the necessary tools to apply machine learning in the real world — and they do so, according to students, in a manner that is remarkably engaging.

Greg Young, an MIT senior majoring in electrical engineering and computer science, says the orchestration of the class, which is co-taught by Wojciech Matusik and Pablo Parrilo of the Department of Electrical Engineering and Computer Science (EECS), is impressive, all the more so because, in his opinion, the trendiness of machine learning (and, consequently, the class enrollment) is nearly out of hand.

“I think people are going where they think the next big thing is,” Young says. Waving an arm to indicate the hundreds of students lined up in desks below him, he says: “The professors certainly do a good job keeping us engaged, considering the size of this class.”

Indeed, the popularity of 6.036 is such that a version for graduate students — 6.862 (Applied Machine Learning) — was folded into it last spring. These students take 6.036 and do an additional semester-long project that involves applying machine learning methods to a problem in their own research.

“Nowadays machine learning is used almost everywhere to make sense of data,” says faculty lead Stefanie Jegelka, the X-Window Consortium Career Development Assistant Professor in EECS. She says her students come from MIT’s schools of engineering, architecture, science, management, and elsewhere. Only one-third of graduate students seeking to take the spinoff secured seats this semester.

How they learn

The success of 6.036, according to its faculty designers, has to do with its balanced delivery of theoretical content and programming experience — all in enough depth to prove challenging but graspable, and, above all, useful. “Our students want to learn to think like an applied machine-learning person,” says Jaakkola, who launched the pilot course with Barzilay. “We try to expose the material in a way that enables students with very minimal background to sort of get the gist of how things work and why they work.”

Once the domain of science fiction and movies, machine learning has become an integral part of our lived experience. From our expectations as consumers (think of those Netflix and Amazon recommendations), to how we interact with social media (those ads on Facebook are no accident), to how we acquire any kind of information (“Alexa, what is the Laplace transform?”), machine learning algorithms operate, in the simplest sense, by converting large collections of knowledge and information into predictions that are relevant to individual needs.

As a discipline, then, machine learning is the attempt to design and build computer programs that learn from experience for the purpose of prediction or control. In 6.036, students study principles and algorithms for turning training data into effective automated predictions. “The course provides an excellent survey of techniques,” says EECS graduate student Helen Zhou, a 6.036 teaching assistant. “It helps build a foundation for understanding what all those buzzwords in the tech industry mean.”

Analysis of laparoscopic procedures

Laparoscopy is a surgical technique in which a fiber-optic camera is inserted into a patient’s abdominal cavity to provide a video feed that guides the surgeon through a minimally invasive procedure.

Laparoscopic surgeries can take hours, and the video generated by the camera — the laparoscope — is often recorded. Those recordings contain a wealth of information that could be useful for training both medical providers and computer systems that would aid with surgery, but because reviewing them is so time consuming, they mostly sit idle.

Researchers at MIT and Massachusetts General Hospital hope to change that, with a new system that can efficiently search through hundreds of hours of video for events and visual features that correspond to a few training examples.

In work they presented at the International Conference on Robotics and Automation this month, the researchers trained their system to recognize different stages of an operation, such as biopsy, tissue removal, stapling, and wound cleansing.

But the system could be applied to any analytical question that doctors deem worthwhile. It could, for instance, be trained to predict when particular medical instruments — such as additional staple cartridges — should be prepared for the surgeon’s use, or it could sound an alert if a surgeon encounters rare, aberrant anatomy.
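The article doesn't spell out the matching machinery, but the flavor of labeling video segments from only a few training examples can be sketched as nearest-neighbor matching on per-segment feature vectors. Everything below is hypothetical: the feature vectors, the stage names, and the distance threshold are invented for illustration and are not the researchers' actual method.

```python
import math

# Hypothetical per-segment feature vectors; in practice these would be
# visual descriptors extracted from laparoscope frames.
TRAINING_EXAMPLES = {
    "biopsy":         [0.9, 0.1, 0.2],
    "tissue removal": [0.2, 0.8, 0.1],
    "stapling":       [0.1, 0.2, 0.9],
}

def label_segment(features, threshold=0.5):
    """Assign the stage of the closest training example, or None if no
    example is close enough (e.g., rare, aberrant anatomy)."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    stage, d = min(
        ((name, dist(features, ex)) for name, ex in TRAINING_EXAMPLES.items()),
        key=lambda pair: pair[1],
    )
    return stage if d <= threshold else None

print(label_segment([0.85, 0.15, 0.25]))  # → biopsy
```

A segment whose features are far from every example falls through the threshold and returns None, which is the hook for the "sound an alert on unfamiliar anatomy" behavior described above.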

“Surgeons are thrilled by all the features that our work enables,” says Daniela Rus, an Andrew and Erna Viterbi Professor of Electrical Engineering and Computer Science and senior author on the paper. “They are thrilled to have the surgical tapes automatically segmented and indexed, because now those tapes can be used for training. If we want to learn about phase two of a surgery, we know exactly where to go to look for that segment. We don’t have to watch every minute before that. The other thing that is extraordinarily exciting to the surgeons is that in the future, we should be able to monitor the progression of the operation in real-time.”

Joining Rus on the paper are first author Mikhail Volkov, who was a postdoc in Rus’ group when the work was done and is now a quantitative analyst at SMBC Nikko Securities in Tokyo; Guy Rosman, another postdoc in Rus’ group; and Daniel Hashimoto and Ozanan Meireles of Massachusetts General Hospital (MGH).

The clutter in online conversations

From Reddit to Quora, discussion forums can be equal parts informative and daunting. We’ve all fallen down rabbit holes of lengthy threads that are impossible to sift through. Comments can be redundant, off-topic, or even inaccurate, but all that content is ultimately still there for us to try to untangle.

Sick of the clutter, a team from MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL) has developed “Wikum,” a system that helps users construct concise, expandable summaries that make it easier to navigate unruly discussions.

“Right now, every forum member has to go through the same mental labor of squeezing out key points from long threads,” says MIT Professor David Karger, who was senior author on a new paper about Wikum. “If every reader could contribute that mental labor back into the discussion, it would save that time and energy for every future reader, making the conversation more useful for everyone.”

The team tested Wikum against a Google document with tracked changes that aimed to mimic the collaborative editing structure of a wiki. They found that Wikum users completed reading much faster and recalled discussion points more accurately, and that editors made edits 40 percent faster.

Karger wrote the new paper with PhD students Lea Verou and Amy Zhang, who was lead author. The team presented the work last week at ACM’s Conference on Computer-Supported Cooperative Work and Social Computing in Portland, Oregon.

How it works

While wikis can be a good way for people to summarize discussions, they aren’t ideal because users can’t see what’s already been summarized. This makes it difficult to break summarizing down into small steps that can be completed by individual users, because it requires that they spend a lot of energy figuring out what needs to happen next. Meanwhile, forums like Reddit let users “upvote” the best answers or comments, but lack contextual summaries that help readers get detailed overviews of discussions.

Wikum bridges the gap between forums and wikis by letting users work in small doses to refine a discussion’s main points, and giving readers an overall “map” of the conversation.

Readers can import discussions from places such as Disqus, a commenting platform used by publishers like The Atlantic. Then, once users create a summary, readers can examine the text and decide if they want to expand the topic to read more. The system uses color-coded “summary trees” that show topics at different levels of depth and lets readers jump between original comments and summaries.
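The summary-tree idea can be pictured as a comment tree in which an editor's summary stands in for a whole subtree until the reader chooses to expand it. This is a toy model for illustration, not Wikum's actual data model or interface; all names here are invented.

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class Comment:
    text: str
    summary: Optional[str] = None  # set once an editor summarizes this subtree
    replies: List["Comment"] = field(default_factory=list)

def render(node: Comment, expanded: bool = False, depth: int = 0) -> List[str]:
    """Collapsed view shows a summary where one exists; the expanded view
    recurses into the original comments underneath it."""
    indent = "  " * depth
    if node.summary and not expanded:
        return [indent + "[summary] " + node.summary]
    lines = [indent + node.text]
    for reply in node.replies:
        lines += render(reply, expanded, depth + 1)
    return lines

thread = Comment(
    "Which license should we use?",
    summary="Consensus: MIT license, one dissent preferring GPL.",
    replies=[Comment("MIT is simplest."), Comment("GPL protects users better.")],
)

print(len(render(thread)))                 # 1 line: just the summary
print(len(render(thread, expanded=True)))  # 3 lines: the full discussion
```

Because each summarized subtree collapses to a single line, readers skim the tree top-down and only pay the reading cost of the branches they expand.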

Creative approaches to connectivity

Daniel Zuo came to MIT with a plan: He wanted to study algorithms and one day become a research professor.

The senior has more than accomplished the former goal, conducting innovative research on algorithms to reduce network congestion in the Networks and Mobile Systems group at MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL). And, as he graduates this spring with a bachelor’s degree in computer science and electrical engineering and a master’s in engineering, he is well on his way to achieving the latter.

But Zuo has also taken some productive detours from that roadmap, including minoring in creative writing and helping to launch MakeMIT, the nation’s largest “hardware hackathon.”

The next step in his journey will take him to Cambridge University, where he will continue his computer science research as a Marshall Scholar.

“The Marshall affords me the opportunity to keep exploring for a couple more years on an academic level, and to grow on a personal level, too,” Zuo says. While studying in the Advanced Computer Science program at the university’s Computer Laboratory, “I’ll be able to work with networks and systems to deepen my understanding and take more time to explore this field,” he says.

Algorithms to connect the world

Zuo fell in love with algorithms his first year at MIT. “It was exactly what I was looking for,” he says with a smile. “I took every algorithms course there was on offer.”

His first research experience, the summer after his freshman year, was in the lab of Professor Manolis Kellis, head of the Computational Biology group at CSAIL. Zuo worked with a postdoc in Kellis’ group to use algorithms to identify related clusters of genes in a single cell type within a specific tissue. “We ended up coming up with a pretty cool algorithm,” he says.

As a research assistant for TIBCO Career Development Assistant Professor Mohammad Alizadeh, Zuo is now working on cutting-edge algorithms for congestion control in networks, with a focus on “lossless” data networks.

Modern computer network applications need to be able to transmit large amounts of data quickly, without losing information. Zuo likens the situation to a congested traffic light. When there are too many messages queuing at the light, some information just gets dropped.

“When the traffic light starts to get too full, I can send a packet back upstream that says ‘Wait, if you’re going to send me something, don’t,’” he explains. But sending that signal can create a new problem: a “back-propagation” of even more pauses, and more congestion upstream. Zuo’s algorithms aim to solve both of these problems, ensuring that sent data are never lost and that “traffic lights” don’t become too crowded.
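A minimal sketch of this pause-and-resume idea, assuming a toy switch with invented high- and low-water thresholds. Real lossless fabrics use standardized mechanisms such as Ethernet priority flow control, and this is an illustration of the general idea, not Zuo's algorithms.

```python
from collections import deque

class LosslessQueue:
    """Toy model of a lossless switch port: instead of dropping packets
    when the buffer fills, it asks the upstream sender to pause."""

    def __init__(self, high_water=8, low_water=4):
        self.buffer = deque()
        self.high_water = high_water
        self.low_water = low_water
        self.upstream_paused = False

    def receive(self, packet):
        self.buffer.append(packet)           # packets are never dropped
        if len(self.buffer) >= self.high_water:
            self.upstream_paused = True      # "if you're going to send me something, don't"

    def forward(self):
        packet = self.buffer.popleft() if self.buffer else None
        if self.upstream_paused and len(self.buffer) <= self.low_water:
            self.upstream_paused = False     # resume early, limiting back-propagated pauses
        return packet

port = LosslessQueue()
for i in range(8):
    port.receive(i)
print(port.upstream_paused)   # True: buffer hit the high-water mark
while port.forward() is not None:
    pass
print(port.upstream_paused)   # False: drained below the low-water mark
```

The gap between the two thresholds is the design tension the article describes: pausing too eagerly propagates congestion upstream, while pausing too late would require dropping packets.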

Protecting communication networks from malicious hackers

Distributed planning, communication, and control algorithms for autonomous robots make up a major area of research in computer science. But in the literature on multirobot systems, security has gotten relatively short shrift.

In the latest issue of the journal Autonomous Robots, researchers from MIT’s Computer Science and Artificial Intelligence Laboratory and their colleagues present a new technique for preventing malicious hackers from commandeering robot teams’ communication networks. The technique could provide an added layer of security in systems that encrypt communications, or an alternative in circumstances in which encryption is impractical.

“The robotics community has focused on making multirobot systems autonomous and increasingly more capable by developing the science of autonomy. In some sense we have not done enough about systems-level issues like cybersecurity and privacy,” says Daniela Rus, an Andrew and Erna Viterbi Professor of Electrical Engineering and Computer Science at MIT and senior author on the new paper.

“But when we deploy multirobot systems in real applications, we expose them to all the issues that current computer systems are exposed to,” she adds. “If you take over a computer system, you can make it release private data — and you can do a lot of other bad things. A cybersecurity attack on a robot has all the perils of attacks on computer systems, plus the robot could be controlled to take potentially damaging action in the physical world. So in some sense there is even more urgency that we think about this problem.”

Identity theft

Most planning algorithms in multirobot systems rely on some kind of voting procedure to determine a course of action. Each robot makes a recommendation based on its own limited, local observations, and the recommendations are aggregated to yield a final decision.

A natural way for a hacker to infiltrate a multirobot system would be to impersonate a large number of robots on the network and cast enough spurious votes to tip the collective decision, a technique called “spoofing.” The researchers’ new system analyzes the distinctive ways in which robots’ wireless transmissions interact with the environment, to assign each of them its own radio “fingerprint.” If the system identifies multiple votes as coming from the same transmitter, it can discount them as probably fraudulent.
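In spirit, the defense amounts to weighting votes by transmitter rather than by vote count. The toy aggregator below gives each distinct radio fingerprint one unit of total weight; it is a simplification for illustration, and the actual paper works with confidence weights derived from wireless channel measurements, not this exact rule.

```python
from collections import Counter

def aggregate(votes):
    """votes: list of (fingerprint, value) pairs.

    Each distinct fingerprint gets one unit of total weight, so a spoofer
    casting many votes from one transmitter counts roughly as one voter.
    """
    per_fp = Counter(fp for fp, _ in votes)
    weight_sum = 0.0
    weighted_total = 0.0
    for fp, value in votes:
        w = 1.0 / per_fp[fp]
        weighted_total += w * value
        weight_sum += w
    return weighted_total / weight_sum

honest = [("fp-a", 10.0), ("fp-b", 10.0), ("fp-c", 10.0)]
spoofed = [("fp-x", 0.0)] * 97                     # 97 spurious votes, one transmitter
print(sum(v for _, v in honest + spoofed) / 100)   # naive average: 0.3
print(aggregate(honest + spoofed))                 # fingerprint-weighted: 7.5
```

Under the naive average the spoofer dominates 97 to 3; under fingerprint weighting the spoofed transmitter collapses to a single effective vote.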

“There are two ways to think of it,” says Stephanie Gil, a research scientist in Rus’ Distributed Robotics Lab and a co-author on the new paper. “In some cases cryptography is too difficult to implement in a decentralized form. Perhaps you just don’t have that central key authority that you can secure, and you have agents continually entering or exiting the network, so that a key-passing scheme becomes much more challenging to implement. In that case, we can still provide protection.

“And in case you can implement a cryptographic scheme, then if one of the agents with the key gets compromised, we can still provide protection by mitigating and even quantifying the maximum amount of damage that can be done by the adversary.”

Hold your ground

In their paper, the researchers consider a problem known as “coverage,” in which robots position themselves to distribute some service across a geographic area — communication links, monitoring, or the like. In this case, each robot’s “vote” is simply its report of its position, which the other robots use to determine their own.
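The article doesn't spell out which coverage algorithm is used, but a standard example is a Lloyd-style update, in which each robot repeatedly moves to the center of the region of points closest to it. A minimal one-dimensional sketch (robots covering the unit interval) shows how each robot's position depends on its neighbors' reports:

```python
def coverage_step(positions, lo=0.0, hi=1.0):
    """One Lloyd-style update for 1-D coverage: each robot moves to
    the midpoint of the interval of points closest to it (its cell
    runs from the midpoint with its left neighbor to the midpoint
    with its right neighbor)."""
    p = sorted(positions)
    new = []
    for i, x in enumerate(p):
        left = lo if i == 0 else (p[i - 1] + x) / 2
        right = hi if i == len(p) - 1 else (x + p[i + 1]) / 2
        new.append((left + right) / 2)
    return new

pos = [0.1, 0.2, 0.9]        # three robots, bunched badly
for _ in range(50):
    pos = coverage_step(pos)
print([round(x, 3) for x in pos])  # → [0.167, 0.5, 0.833]
```

The robots converge to an even spacing. Because every update depends on neighbors' reported positions, a spoofed report drags honest robots out of place, which is exactly the distortion the fingerprinting defense bounds.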

The paper includes a theoretical analysis that compares the results of a common coverage algorithm under normal circumstances and the results produced when the new system is actively thwarting a spoofing attack. Even when 75 percent of the robots in the system have been infiltrated by such an attack, the robots’ positions are within 3 centimeters of what they should be. To verify the theoretical predictions, the researchers also implemented their system using a battery of distributed Wi-Fi transmitters and an autonomous helicopter.

Prevent customer profiling and price gouging

Most website visits these days entail a database query — to look up airline flights, for example, or to find the fastest driving route between two addresses.

But online database queries can reveal a surprising amount of information about the people making them. And some travel sites have been known to jack up the prices on flights whose routes are drawing an unusually high volume of queries.

At the USENIX Symposium on Networked Systems Design and Implementation next week, researchers from MIT’s Computer Science and Artificial Intelligence Laboratory and Stanford University will present a new encryption system that disguises users’ database queries so that they reveal no private information.

The system is called Splinter because it splits a query up and distributes it across copies of the same database on multiple servers. The servers return results that make sense only when recombined according to a procedure that the user alone knows. As long as at least one of the servers can be trusted, it’s impossible for anyone other than the user to determine what query the servers executed.
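Splinter itself is built on a newer cryptographic primitive called function secret sharing, but the split-and-recombine idea can be seen in classic two-server private information retrieval. The following toy sketch (not Splinter's actual protocol) retrieves one record by index: each server sees only a random-looking subset of indices, yet XORing the two answers yields exactly the requested record.

```python
import secrets

def private_lookup(db, index):
    """Toy two-server private lookup. Server 1 gets a uniformly
    random subset of indices; server 2 gets the same subset with the
    target index toggled. Each subset alone reveals nothing about
    the target, but the XOR of the two answers is db[index]."""
    n = len(db)
    s1 = {i for i in range(n) if secrets.randbits(1)}  # to server 1
    s2 = s1 ^ {index}                                  # to server 2

    def server_answer(subset):
        # Each server XORs together the records it was asked for.
        acc = 0
        for i in subset:
            acc ^= db[i]
        return acc

    # Records in both subsets cancel; only db[index] survives.
    return server_answer(s1) ^ server_answer(s2)

db = [42, 7, 99, 13]
print(private_lookup(db, 2))  # → 99
```

Correctness follows from XOR algebra: the two subsets differ only at the target index, so every other record appears an even number of times and cancels. As the article notes, privacy holds as long as at least one server doesn't collude with the other.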

“The canonical example behind this line of work was public patent databases,” says Frank Wang, an MIT graduate student in electrical engineering and computer science and first author on the conference paper. “When people were searching for certain kinds of patents, they gave away the research they were working on. Stock prices is another example: A lot of the time, when you search for stock quotes, it gives away information about what stocks you’re going to buy. Another example is maps: When you’re searching for where you are and where you’re going to go, it reveals a wealth of information about you.”

Honest broker

Of course, if the site that hosts the database is itself collecting users’ data without their consent, the requirement of at least one trusted server is difficult to enforce.

Wang, however, points to the increasing popularity of services such as DuckDuckGo, a search engine that uses search results from other sites, such as Bing and Yahoo, but vows not to profile its customers.

“We see a shift toward people wanting private queries,” Wang says. “We can imagine a model in which other services scrape a travel site, and maybe they volunteer to host the information for you, or maybe you subscribe to them. Or maybe in the future, travel sites realize that these services are becoming more popular and they volunteer the data. But right now, we’re trusting that third-party sites have adequate protections, and with Splinter we try to make that more of a guarantee.”

Reducing the number of exposures necessary

Compressed sensing is an exciting new computational technique for extracting large amounts of information from a signal. In one high-profile demonstration, for instance, researchers at Rice University built a camera that could produce 2-D images using only a single light sensor rather than the millions of light sensors found in a commodity camera.

But using compressed sensing for image acquisition is inefficient: That “single-pixel camera” needed thousands of exposures to produce a reasonably clear image. Reporting their results in the journal IEEE Transactions on Computational Imaging, researchers from the MIT Media Lab now describe a new technique that makes image acquisition using compressed sensing 50 times as efficient. In the case of the single-pixel camera, it could get the number of exposures down from thousands to dozens.
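The Media Lab improvement comes from adding ultrafast time-of-flight information, which the sketch below omits; it illustrates only the baseline compressed-sensing measurement model. Each "exposure" is a single inner product of the scene with a patterned mask, and the image is recovered by exploiting sparsity. This deliberately tiny example uses a deterministic ±1 pattern (a stand-in for the random masks a real single-pixel camera would use) and a scene with one bright pixel, so brute-force sparse recovery is exact:

```python
def sign_column(j, m):
    """Deterministic ±1 measurement pattern for pixel j (a stand-in
    for the random patterns a single-pixel camera would flash)."""
    return [1 if (j + 1) >> b & 1 else -1 for b in range(m)]

def measure(x, m):
    """m 'exposures': each is one inner product of the scene with a
    ±1 pattern -- all that a single light sensor can record."""
    n = len(x)
    cols = [sign_column(j, m) for j in range(n)]
    return [sum(cols[j][i] * x[j] for j in range(n)) for i in range(m)]

def recover_1sparse(y, n, m):
    """Brute-force sparse recovery, assuming one bright pixel:
    pick the pattern that best explains the measurements y."""
    best = None
    for j in range(n):
        a = sign_column(j, m)
        aa = sum(v * v for v in a)
        c = sum(ai * yi for ai, yi in zip(a, y)) / aa
        resid = sum((yi - c * ai) ** 2 for ai, yi in zip(a, y))
        if best is None or resid < best[0]:
            best = (resid, j, c)
    return best[1], best[2]  # (pixel index, brightness)

n, m = 20, 8              # 20-pixel scene, only 8 exposures
x = [0.0] * n
x[13] = 5.0               # one bright pixel
y = measure(x, m)
print(recover_1sparse(y, n, m))  # → (13, 5.0)
```

The point of compressed sensing is that 8 exposures suffice for a 20-pixel scene because the scene is sparse; real images need many more measurements and more sophisticated solvers, which is why cutting the required exposures by a factor of 50 matters.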

One intriguing aspect of compressed-sensing imaging systems is that, unlike conventional cameras, they don’t require lenses. That could make them useful in harsh environments or in applications that use wavelengths of light outside the visible spectrum. Getting rid of the lens opens new prospects for the design of imaging systems.

“Formerly, imaging required a lens, and the lens would map pixels in space to sensors in an array, with everything precisely structured and engineered,” says Guy Satat, a graduate student at the Media Lab and first author on the new paper. “With computational imaging, we began to ask: Is a lens necessary? Does the sensor have to be a structured array? How many pixels should the sensor have? Is a single pixel sufficient? These questions essentially break down the fundamental idea of what a camera is. The fact that only a single pixel is required and a lens is no longer necessary relaxes major design constraints, and enables the development of novel imaging systems. Using ultrafast sensing makes the measurement significantly more efficient.”

Recursive applications

One of Satat’s coauthors on the new paper is his thesis advisor, associate professor of media arts and sciences Ramesh Raskar. Like many projects from Raskar’s group, the new compressed-sensing technique depends on time-of-flight imaging, in which a short burst of light is projected into a scene, and ultrafast sensors measure how long the light takes to reflect back.
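The arithmetic behind a time-of-flight measurement is simple: the pulse covers the camera-to-surface distance twice, so depth is the round-trip time multiplied by the speed of light and halved. A two-line sketch:

```python
C = 299_792_458.0  # speed of light in vacuum, m/s

def distance_from_round_trip(t_seconds):
    """Depth from a time-of-flight measurement: the pulse travels
    to the surface and back, so halve the round trip."""
    return C * t_seconds / 2

# A reflection arriving 20 nanoseconds after the pulse left
print(distance_from_round_trip(20e-9))  # → 2.99792458 (meters)
```

The nanosecond scale of these round trips is why time-of-flight imaging needs ultrafast sensors such as the streak cameras and SPADs discussed below.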

The new technique itself relies on time-of-flight imaging, but, somewhat circularly, one of its potential applications is improving the performance of time-of-flight cameras. It could thus have implications for a number of other projects from Raskar’s group, such as a camera that can see around corners and visible-light imaging systems for medical diagnosis and vehicular navigation.

Many prototype systems from Raskar’s Camera Culture group at the Media Lab have used time-of-flight cameras called streak cameras, which are expensive and difficult to use: They capture only one row of image pixels at a time. But the past few years have seen the advent of commercial time-of-flight cameras called SPADs, for single-photon avalanche diodes.

Though not nearly as fast as streak cameras, SPADs are still fast enough for many time-of-flight applications, and they can capture a full 2-D image in a single exposure. Furthermore, their sensors are built using manufacturing techniques common in the computer chip industry, so they should be cost-effective to mass produce.