Few diagnoses are more frightening to a woman than breast cancer. Unfortunately, the disease is common among women in their 50s and 60s, and it also strikes younger women more often than is generally expected. The fight against it continues to evolve as doctors search for better treatments and for ways to help women around the world confront the condition as effectively as possible.
Some medical breakthroughs are advancing this fight and giving women everywhere a better chance at a good outcome. Vector Surgical's margin marker is a powerful tool that is helping doctors find new ways to detect and study how the cancer behaves. More and more doctors are using it to monitor their patients under more controlled conditions, allowing accurate targeting of any recurrence.
A system like this will undoubtedly help engage more patients and shine a stronger light on a condition that remains difficult to understand. Knowing how the disease behaves in someone who has had it before is a key factor in studying it successfully. Breast cancer is a treatable cancer, but it can be hard to know whether it will recur and what can be done when further treatment is needed.
Women suffering from the condition will be able to receive treatment that helps them tremendously, which is one reason this tool is so valuable. It could change the way monitoring is done for women all over the world, and the resulting data will help doctors understand how the disease behaves and how it can be stopped from coming back after an initially successful treatment.
The people wearing the giant goggles and laughing, ducking, jumping, and wildly waving their arms weren’t the only ones having fun.
“Watching people in virtual reality is the best thing,” said a man as he waited for a chance to don a large black headset and hand-held controllers and take a turn as a professional hockey goalie deflecting virtual pucks.
“Dude, it’s addicting,” said another, fresh from a trip to the ocean floor and a close encounter with a blue whale. Across the hall, a line formed for a motion simulator that mimicked a shaking race car, taking headset-wearing users for a spin on a virtual speedway.
For a real-world observer at the Harvard Innovation Lab (i-lab) on Wednesday, the entertainment potential of virtual and augmented reality was abundantly clear.
But what many consider the next great tech revolution isn’t only about fun. Experts say it has the potential to transform business, art, education, science, design, manufacturing, and medicine. It’s also expected to be worth $100 billion by 2020.
The future of virtual and augmented reality was the theme of a HUBweek event that attracted scholars, students, scientists, educators, entrepreneurs, and software developers along with the merely curious for an afternoon of demonstrations and discussions.
In the coming weeks the i-lab will open its own augmented reality-virtual reality lab, said Jodi Goldstein, the facility’s Bruce and Bridgitt Evans Managing Director. Goldstein introduced the keynote address, featuring Rony Abovitz video-linked to the talk via a rolling robot. Abovitz, an entrepreneur who made his mark with computer-assisted surgery, is the man behind what Goldstein called “one of the most mysterious yet highly transformative ventures in this space”: Magic Leap Inc.
Abovitz’s firm is developing “Mixed Reality Lightfield,” a combination of virtual reality, in which users wear goggles and look into a screen that simulates an alternate universe; augmented reality, which takes a person’s view of the real world and layers on top of it things such as Pokémon characters or maps of the nearest subway stops; and light fields.
The technology fits perfectly with the brain’s spatially oriented visual-processing mechanism, said Abovitz, and will open a world of discovery to those who have struggled to transfer two-dimensional information or text into “spatial learning.”
“I think it will make life easier for a lot of people and open doors for a lot of people because we are making technology fit how our brains evolved into the physics of the universe rather than forcing our brains to adapt to a more limited technology,” said the Magic Leap president and CEO, whose hopes for the technology include, among other things, parking a virtual “Star Wars” X-wing in his driveway.
A range of discussions highlighted the practical applications of virtual reality. In a session on surgery and rehabilitation, Jayender Jagadeesan, an assistant professor at Harvard Medical School, described research at Brigham and Women’s Hospital in which a modified Oculus Rift augmented-reality headset is helping surgeons determine a tumor’s exact location.
“You can pull in any of the patient’s specific imaging, while the surgeon is actually doing this procedure,” Jagadeesan said of the multiple diagnostic screens a physician can instantly access while wearing the headset during an operation.
“You can be in a classroom in the middle of the winter, but psychologically you are at a pond or a forest in the middle of the summer,” said Chris Dede, the Timothy E. Wirth Professor in Learning Technologies, of ecoMUVE, a technology that uses immersive virtual environments to teach students about delicate ecosystems.
In the i-lab’s exhibit hall visitors had access to a range of experiences and applications, including a round rolling camera developed to help first responders assess dangerous situations and a wearable device that lets users feel virtual objects.
Among the Harvard undergrads at the event was Madeleine Woods ’19, who was on site with Harvard College Virtual Reality, or Convrgency. The organization aims to bring students together to explore the creative potential of virtual reality.
Woods, a joint concentrator in folklore and mythology and English with a secondary in archaeology, described herself as a theater kid who loves the arts and humanities along with coding and technology. She said virtual reality merges her varied interests, calling it both “empathetic” and “humanistic.”
“This is the first time I think technology really opens people up,” said Woods. “You can experience the life of someone else across the world. You can see something you’ve never [seen in person]. I think you can understand people so much better when you can walk in their shoes, and this is something that literally puts that technology in hand.”
When winter temperatures drop to frigid in Cambridge, the air inside some rooms at Eliot House soars to downright tropical.
That’s because Eliot, an upperclassman dormitory built in 1931, uses a steam-driven heat exchanger to pump hot water through the building whenever the outdoor temperature drops below 48 degrees. To ensure that enough steam reaches radiators at the end of the line, radiators in rooms closer to the input get hotter than necessary.
With limited temperature controls in their dorm rooms, some sweltering students resort to cracking windows to let some of the heat escape. Aldís Elfarsdóttir ’18, an environmental science and engineering concentrator at the John A. Paulson School of Engineering and Applied Sciences (SEAS), didn’t like the environmental implications of that.
Curious about the impact of wasting all that energy, she took on an extracurricular project through her work with the Harvard Office for Sustainability (OFS) to quantify the amount of energy flying out the windows during wintertime. This inefficiency is one reason undergraduate Houses are undergoing a renewal that includes state-of-the-art heating systems and energy-efficient windows.
Working with Siemens energy engineer Christopher Bitzas, Elfarsdóttir discovered that if all its windows were kept closed through the winter, Eliot House could save 358 million BTUs of thermal energy — slightly more energy than an average person in the United States consumes during an entire year. Based on the cost of purchasing steam, closing the windows would save nearly $14,000 each winter.
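The article's two figures imply a unit cost for the purchased steam. A quick back-of-the-envelope check (the per-MMBTU rate below is derived from the article's own numbers, not a quoted price):

```python
# Back-of-the-envelope check of the Eliot House savings figures.
# Both inputs come from the article; the steam rate is inferred from them.
saved_btu = 358e6        # thermal energy saved per winter, in BTU
saved_dollars = 14_000   # approximate cost savings per winter, in USD

saved_mmbtu = saved_btu / 1e6               # 358 MMBTU
implied_rate = saved_dollars / saved_mmbtu  # roughly $39 per MMBTU of steam

print(f"Energy saved: {saved_mmbtu:.0f} MMBTU")
print(f"Implied steam cost: ${implied_rate:.2f}/MMBTU")
```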
With that data in mind, Elfarsdóttir attended a design-thinking workshop at SEAS, organized in conjunction with OFS. There she met Patrick Kuiper, M.E. ’16, then an applied mathematics master’s student, who introduced her to Patrick Day, S.M. ’16, an engineering sciences master’s student. Together, they launched a data-gathering project to help Eliot House conserve energy. The Faculty of Arts and Sciences Office of Physical Resources and Planning provided funding, and OFS advised the team.
The project involved installing Intel Edison Internet of Things development boards, retrofitted with temperature and humidity sensors, into 15 Eliot House dorm rooms to gather real-time environmental data.
“We had been using these devices for fun, and then Aldís came along and had a great application for them,” Kuiper said. “These simple devices give us a way to quantitatively analyze people’s temperature perceptions.”
Fine-tuning the miniature computers and connecting them to Amazon Web Service to collect and organize data was an iterative process that involved its share of trial and error, Day said.
But the biggest challenge the team faced came when they arrived at Eliot House in early August to install the devices. Due to a scheduling conflict with a move-in day, they had less than 24 hours to set up all 15 computers. They had to work late into the night to install the sensors on all four floors of the House.
The devices now provide temperature and humidity information twice an hour. The team intends to use that data, in conjunction with qualitative input from resident surveys, to help students select rooms they are likely to find more comfortable.
So far, 106 residents have completed a survey that asks their temperature preferences and demographic background. In the spring, Elfarsdóttir will survey residents again to determine whether their room felt too hot or too cold for them during the winter.
That data will lay the foundation for a model that can be used to make suggestions to students when it comes time for room selection, said Kuiper. For instance, a student who hails from the Deep South and loves beach weather might be more comfortable in a room that gets warmer in winter, he said.
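A model like the one Kuiper describes could start as a simple nearest-match between a student's stated temperature preference and each room's measured winter average. A minimal sketch, in which the room names, averages, and preference values are all invented for illustration:

```python
# Illustrative sketch of a preference-to-room matcher (all data hypothetical).
# In practice, each room's winter average would come from the sensor logs
# and the preferred temperature from the resident survey.
def recommend_room(preferred_temp_f, room_averages):
    """Return the room whose measured winter average is closest to the
    student's stated preference (degrees Fahrenheit)."""
    return min(room_averages,
               key=lambda room: abs(room_averages[room] - preferred_temp_f))

# Hypothetical winter averages for a few Eliot rooms, in degrees F.
rooms = {"E-11": 76.5, "E-22": 71.0, "E-33": 68.2}

print(recommend_room(74, rooms))  # a warmth-lover is matched to the hottest room
```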
“I am so excited to see this project come to life,” Elfarsdóttir said. “Our hope is that, by increasing occupant comfort, we can simultaneously save energy because there will be reduced window-opening during winter.”
In addition to reducing energy usage, the team hopes the data generated will contribute to other research projects, at Harvard and beyond.
“Maybe this project will help inform the upcoming renovations at Eliot House,” Day said. “Hopefully, this will provide some real data that will help decision-makers select a heating system that will work better for the building.”
Heather Henriksen, who directs the OFS, said one of her office’s primary goals is to facilitate projects like this that use the campus as a test bed for student and faculty research.
For Elfarsdóttir, it was especially rewarding to work on a project that could impact the future of her House, which is due to be renovated in three years.
As she travels through narrow hallways and up creaky flights of stairs, checking the sensors and chatting with housemates about the project’s progress, it’s clear to Elfarsdóttir that stately Eliot House has become a living laboratory.
“This project has impressed upon me how data can show us, in a completely quantitative way, how we are interacting with our living environment,” she said.
The human field of vision is only about 180 degrees, so if you’re reading this at your desk, you should have a good view of the stuff that’s right in front of you — your computer and phone, maybe some pictures of your family.
Despite that limited view, your brain is able to stitch together a coherent 360-degree panorama of the world around you, and now researchers are beginning to understand how.
Harvard scientists have pinpointed two regions in the brain’s so-called scene network — the retrosplenial cortex (RSC) and the occipital place area (OPA) — and demonstrated that they share nearly identical patterns of neural activation when people are shown images of what is in front of and behind them. The finding suggests that these regions play a key role in helping humans understand their visual environment. The study is described in an Aug. 26 paper in Current Biology.
“We have a limited field of view — we can only see what’s immediately in front of us,” said lead author Caroline Robertson, a junior fellow of the Harvard Society of Fellows. “And yet you have a very detailed visual memory, particularly in a familiar place like your office.
“We know there are cells in the brain — like head direction cells — that maintain a representation of your spatial position in the environment around you,” she added. “Yet your visual system, all we know is how it responds to what’s in your current field of view. What we wanted to get at is the intersection between those two, between memory and perception.”
Though scientists have long understood that certain brain regions are involved in processing scenes as opposed to faces or bodies, specifying regions involved in merging the images we see moment-by-moment into a coherent view of the world demanded some creative thinking — and some gaming hardware.
As part of a series of tests, Robertson and colleagues used virtual-reality goggles to enable volunteers to explore panoramic images of Boston’s Beacon Hill neighborhood. The first test was largely a proof of concept. Volunteers donned VR goggles and were shown a series of panoramic images. Some saw a single continuous image while others saw images that contained a gap.
When participants were later shown pairs of snapshots, researchers found that those who had seen the continuous panorama were better able to identify images that were across the street from each other.
Next, participants were placed in an MRI scanner and asked whether images came from the left or right side of the street. Over 90 minutes, researchers collected dozens of measurements showing patterns of brain activity, and later analyzed those patterns hoping to find similarities.
“What we were looking for in our analysis was whether neural activity for images that were across the street from each other looked similar,” Robertson said.
Armed with that MRI data, Robertson and colleagues found that while one part of the brain’s “scene network” — the parahippocampal place area (PPA) — reacted in the same way regardless of the scene, the RSC and OPA showed similar activation patterns for images that were connected, suggesting that they played a role in constructing panoramic images.
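The comparison Robertson describes is, at its core, a similarity measure between two patterns of voxel activity. A minimal sketch of the idea, using random data in place of real fMRI measurements and Pearson correlation standing in for whatever similarity metric the study actually used:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical voxel activation patterns for three scenes (100 voxels each).
# "Connected" scenes share underlying structure; the unrelated scene does not.
base = rng.normal(size=100)
scene_a = base + 0.3 * rng.normal(size=100)   # one side of the street
scene_b = base + 0.3 * rng.normal(size=100)   # the facing side
unrelated = rng.normal(size=100)              # a different panorama

# Pearson correlation between activation patterns.
r_connected = np.corrcoef(scene_a, scene_b)[0, 1]
r_unrelated = np.corrcoef(scene_a, unrelated)[0, 1]

print(f"connected pair: r = {r_connected:.2f}")
print(f"unrelated pair: r = {r_unrelated:.2f}")
```

Regions whose activity shows high similarity for connected views (like the RSC and OPA in the study) are candidates for stitching discrete snapshots into a panorama.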
The final test, Robertson said, was nearly identical, but before asking participants whether an image came from the left or right, researchers briefly flashed a “prime” image — a view down the street — with the expectation that it would trigger people’s memories of the full panorama.
“Once you form that association in the brain, and you have an overlap between images from either side of the street, I expect that if I show you image one, I’m implicitly triggering image two,” Robertson explained. “So we found people were faster and more accurate if they saw the priming image versus seeing a totally different panorama or no prime at all.”
The study offers new insight into how vision and memory work together to inform our understanding of the world around us, she added.
“Even though we only get these very discrete snapshots of our world, and even though that snapshot is interrupted when we blink, it doesn’t feel as though our mental image of our environment is constantly going on- and off-line,” Robertson said. “We feel a smooth, consistent representation of the world that we are interacting in, and there’s evidence that, for that to happen, there needs to be a hub, somewhere in the brain, where your current field of view interacts with your memory of what’s around you, and that’s what we’re putting together in these regions of the brain.”
Many women report forgetfulness and changes in memory as they transition to menopause. But studies that target participants who are 65 and older do not account for cognitive changes that may take place decades earlier in a woman’s life.
By studying women ages 45 to 55, investigators at Harvard-affiliated Brigham and Women’s Hospital (BWH) have found that reproductive stage, not simply chronological age, may contribute to changes in memory and brain function. Their findings will be published Wednesday in The Journal of Neuroscience.
“We set out to study cognitive aging from a women’s health perspective. One of the most profound hormonal changes in a woman’s life is the transition to menopause. By shifting our focus to this midlife period, we detected early changes in memory circuitry that are evident decades before the age range traditionally targeted by cognitive neuroscience studies on aging,” said lead author Emily Jacobs, a former member of the Division of Women’s Health and the Department of Psychiatry at BWH. “Aging isn’t a process that suddenly begins at 65. Subtle neural and cognitive changes happen earlier. Considering a person’s sex and reproductive status — above and beyond numerical age — is critical for detecting those changes.”
The research team studied 200 men and women, using functional MRI to look at regional and network-level changes in the brain’s memory circuitry. Participants performed a task that tested their verbal memory: They were shown two words on a screen and asked to form a sentence using them, then were later tested on their memory for the words. The researchers also collected information on the female participants’ menopausal status and measured steroid hormone levels, including 17β-estradiol, a sex steroid hormone that declines during menopause.
Overall, the researchers found that when estradiol levels were lower, more pronounced changes in the hippocampus — one of the primary regions of the brain implicated in learning and memory — were seen, and participants with lower levels of the hormone performed worse on the memory task.
The team also evaluated high-performing postmenopausal women, finding that they exhibited brain activity patterns that resembled the activity of premenopausal women.
“Our findings underscore the incredible variability of the brain as we age and the critical importance and complexity of the impact of sex on aging, including the unique role of sex steroid hormones in memory function,” said senior author Jill Goldstein, director of research at the Connors Center for Women’s Health and Gender Biology at BWH and a professor of medicine at Harvard Medical School. “Maintaining intact memory function with age is one of the greatest public health challenges of our time, and applying a sex-dependent lens to the study of memory circuitry aging will help identify early antecedents of future memory decline and risk for Alzheimer’s disease.”
This work was supported by a National Institute of Mental Health grant to Goldstein, the principal investigator, and a junior faculty career development award to Jacobs from the Office of Research on Women’s Health, the National Institute of Child Health and Human Development, and the Harvard Clinical and Translational Science Center.