The commercial 100G transmission system recently announced by Nortel, which has been adopted by the US carrier Verizon. Credit: NORTEL
This year looks set to be the point at which optical networks operating at data rates of 100 Gbit s−1 (100G) per wavelength channel become a commercial reality. Although 40G networks are still just starting to be rolled out, there has been considerable activity in the 100G market, and analyst company Ovum believes that 2010 will see the first revenue-generating deployments of 100G technology. It estimates that the global demand for transponders (integrated transmitter–receiver units) for 100G dense-wavelength-division multiplexing communication systems will grow rapidly over the next five years, reaching $455 million by 2014. This summer, the IEEE standards for 40G and 100G Ethernet technologies will be ratified, after which the market can start to grow in earnest.
The end of 2009 saw a plethora of announcements regarding 100G trials and deployments from telecommunications operators such as Verizon and Deutsche Telekom, and system vendors such as Huawei, Nortel, Ericsson and Alcatel-Lucent. The US operator Verizon claimed it was the first telecommunications carrier to successfully deploy a commercial 100G ultralong-haul optical system for live traffic between Paris and Frankfurt, based on Nortel technology. At around the same time, Huawei announced the successful completion of a 100G long-haul transmission field trial with Telefónica in Spain, covering a distance of more than 1,000 km without electrical regeneration. Ericsson and Deutsche Telekom announced a joint 100G field trial on an existing optical platform as part of the European 100GET project, and Nortel made several 100G announcements including a successful trial over a 600-km 100G link spanning from New York to Boston.
Developers of 40G and 100G technology are of course acutely aware of the need to keep costs down. For this reason, there is a strong drive to make the technology compatible with existing fibre infrastructure.
“Our coherent frequency-division multiplexing solution includes digital signal processing, enabling us to achieve the full 112 Gbit s−1 capacity using 10G-class components. Rather than using intensity-modulated direct detection, Nortel is using dual-polarization quadrature phase shift keying with coherent receivers, together with advanced digital signal processing,” says Helen Xenos, 40G/100G product marketing manager at Nortel.
Dual-polarization quadrature phase shift keying is a modulation format that effectively sends four times as much information as traditional optical transmissions of the same speed. When paired with a coherent receiver that can detect this modulation format, the optical transmission rate can be slowed, which reduces the effects of signal distortions such as chromatic dispersion and polarization mode dispersion. Any remaining signal distortion resulting from dispersion is eliminated by integrated electronic dispersion-compensation technology, which adjusts for distortion at the receiver side of the transmission. By using these advanced signal processing technologies, Nortel's solution can transmit a 100G signal — even over an impaired fibre that cannot be used for traditional 10G transmission — without the need for separate compensators for chromatic dispersion and polarization mode dispersion.
“Regardless of the modulation format used, a critical element for achieving 100G transmission over long- and ultralong-haul distances is to decrease the transmitted baud rate so that optical impairments such as dispersion are controlled, and to have an effective coherent receiver to be able to recover the modulated signal,” says Ron Kline, principal analyst for network infrastructure at Ovum.
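As a rough sanity check on those figures, the arithmetic below (ours, not a vendor specification) shows how a 112 Gbit s−1 line rate maps onto modest symbol rates once the bits are spread across QPSK phases, two polarizations and, reading Xenos's “coherent frequency-division multiplexing” as two subcarriers, a pair of subcarriers.

```python
# Back-of-the-envelope check, not a vendor specification. The subcarrier
# count is an assumption inferred from the phrase "coherent
# frequency-division multiplexing"; the article does not state it.

LINE_RATE_GBPS = 112        # 100G payload plus FEC and framing overhead
BITS_PER_QPSK_SYMBOL = 2    # QPSK carries 2 bits per symbol
POLARIZATIONS = 2           # dual polarization doubles the bits per symbol
SUBCARRIERS = 2             # assumed, see note above

bits_per_symbol_period = BITS_PER_QPSK_SYMBOL * POLARIZATIONS * SUBCARRIERS
baud_rate_gbaud = LINE_RATE_GBPS / bits_per_symbol_period
print(f"Symbol rate per subcarrier: {baud_rate_gbaud:.0f} Gbaud")  # 14 Gbaud
```

At roughly 14 Gbaud per subcarrier, the electronics sit close to the mature 10G component class, which is the cost argument made above; without the subcarrier assumption the figure doubles to 28 Gbaud, still far below a serial 112 Gbaud signal.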
For their trial, Ericsson and Deutsche Telekom used an existing link of mixed 10G and 40G traffic with a 50 GHz channel spacing. A line rate of 112 Gbit s−1 was achieved using polarization-multiplexed return-to-zero differential quadrature phase shift keying over 600 km of standard single-mode fibre. The link included multiple reconfigurable optical add-drop multiplexers and amplifiers. With carefully optimized links, transmission distances of more than 1,200 km are possible and have been demonstrated in the lab.
Huawei's 100G field trial in Spain was based on the company's OSN 6800 wavelength-division multiplexing/optical transport network platform and standard G.652 optical fibre. The link featured ten sets of reconfigurable optical add-drop multiplexers, 33 optical amplifiers and two sets of multiplexer–demultiplexers. Transmission of a mixture of 10G/40G/100G data with a 50 GHz channel spacing was achieved without interfering with the existing network.
Alcatel-Lucent used proprietary digital signal processing algorithms to optimize coherent detection of data in its 100G tests. The 112 Gbit s−1 field trial spanned 1,088 km between four cities in Spain and took place over an existing heavily loaded fibre link carrying live traffic on the Telefónica network.
“Although it is exciting to see so many successful 100G trials, the technology must be cost-effective before it will take off,” warns Kline. “The aim with 40G was to make it 2.5 times the cost of 10G, while giving four times the bandwidth. We are not yet at that stage with 40G, and prices for 10G are dropping. So many companies are questioning whether or not to invest in 40G, let alone 100G. The 100G solution needs to be more cost-effective than ten separate 10G wavelengths, and we are several years away from achieving this.”
Almost everyone has heard, “It’s not personal; it’s just business.” While this phrase sounds okay on the surface, adopting this belief is actually more damaging than you’d think for your employees and customers.
Business is inherently personal because companies are made up of people who aren’t interested in one-size-fits-all approaches. No one wants to feel like a cog in a wheel, which is why taking a personal approach in business often leads to better performance and greater satisfaction in your work.
If you want to take a personal approach in your business, this starts with how you treat your employees and customers. If these relationships aren’t as strong as you would like them to be, here are some strategies for improving them.
Salary will always play a role in an employee’s job satisfaction, but these days a higher salary doesn’t carry as much negotiating power as it once did. The “great resignation” has forced many companies to see that their old ways aren’t cutting it in the current environment. It isn’t just recommended to take a more personal approach to your business relationships—it’s a necessity to keep your team from migrating to the competition.
Employees need to feel that their work has meaning and, more importantly, see how it contributes to the greater good. Here are a few ways you can begin taking a personal approach with your employees:
Be transparent: There’s nothing more frustrating than working in a job where it feels like management is constantly withholding information. You’re not protecting your employees from anything—you’re creating unnecessary anxiety in the office. Be honest with your team and let them know what’s happening in the business—they’ll be more committed to the company because of it.
Provide opportunities to advance: It’s hard to experience job satisfaction if you don’t feel like you’re growing and getting better at what you do. Look for ways to deliver your employees opportunities to advance, and talk to them about new positions that will be available as the company grows.
Remember birthdays: Don’t let staff birthdays come and go without acknowledging them. Mark the dates of all your employees’ birthdays in your calendar and order them a cake, or something similar, to celebrate. It may seem like a small gesture, but it will go a long way toward showing your employees you care about them.
Your customers drive your business, so you need to consider their interactions with your company from their point of view. Staying connected to your customers and showing them you care about their opinions will build long-term brand loyalty.
One of the easiest ways to do this is by simply thanking your customers for their business. If you’re a small business, you may be able to call each customer personally and thank them for their support.
Another option is to send cards thanking your customers for their business. You can also send holiday cards to show your appreciation, but you don’t just want to engage your customers when things are going well—it’s just as important to reach out when there’s a problem.
Instead of seeing complaints as a hassle, use them as opportunities to strengthen the relationship with your customers. Mistakes are inevitable, and when you apologize and do what you can to fix the problem, it builds trust with your customers.
As a business owner, you need to find ways to motivate and inspire your employees. Happy employees will be more productive, more engaged with their work, and more creative, which can also lead to lower employee turnover rates and help your bottom line.
Your goal with each customer is to increase the customer lifetime value (CLV). A high CLV means that customer brings in more revenue for your business. By building credibility and trust with your customers, you’ll lower your customer churn and, of course, make each customer more impactful for your longevity.
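As a concrete illustration, consider a common simplified CLV model (not one given in the article; all numbers below are hypothetical): margin earned per month multiplied by expected customer lifetime, where a constant churn rate implies an expected lifetime of one over churn.

```python
# Simplified CLV sketch (illustrative only; the article gives no formula
# and these inputs are hypothetical).

def customer_lifetime_value(avg_monthly_revenue: float,
                            gross_margin: float,
                            monthly_churn: float) -> float:
    """Margin earned per month times expected lifetime (1 / churn) in months."""
    return avg_monthly_revenue * gross_margin / monthly_churn

# Halving churn by building credibility and trust doubles CLV:
print(customer_lifetime_value(100.0, 0.6, 0.05))   # 1200.0
print(customer_lifetime_value(100.0, 0.6, 0.025))  # 2400.0
```

The point of the model is the one made above: lowering churn directly raises how much each customer contributes over their lifetime.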
As technology becomes more advanced, it’s easy for businesses to lose sight of what really matters. We can automate processes and communicate with our team/customers through apps for convenience, but if we don’t focus on the human component of our relationships, that convenience counts for far less.
When you take a personal approach in business, you treat your employees and customers as individuals and look for personalized solutions to every problem. You look beyond your CRM and Slack to find ways to build strong relationships—an old approach to a new challenge. Take the time, put yourself in someone else’s shoes, and strategize to implement systems that benefit your team and customers just as much as your profit line.
Substituting can be a tricky art, especially when stars are involved.
When massive stars explode, they can collapse into extremely dense — and mysterious — objects known as neutron stars. But neutron stars are too far away and much too small for even the most powerful telescopes to look inside, so scientists want to find a way to figure out what a neutron star is made of. In new research, astrophysicists tested a potential approach to determining the state of the matter inside a neutron star. (More familiar states of matter are solid, liquid and gas.)
What scientists want to know is a neutron star's equation of state, or EoS. This equation describes the properties of matter in an object or substance. But getting the precise measurements needed to solve this equation for a neutron star, especially its radius, has not been easy.
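For readers who want the underlying mathematics, the link between an EoS and a radius is standard textbook physics rather than anything specific to the new work: an EoS relates pressure to density, and feeding it into the Tolman-Oppenheimer-Volkoff (TOV) equations of stellar structure determines the star's mass and radius.

```latex
% Standard stellar-structure relations (textbook physics, not the paper's
% new result): an EoS P = P(\rho) closes the TOV equations
\begin{align}
  \frac{dm}{dr} &= 4\pi r^{2}\,\rho(r), \\
  \frac{dP}{dr} &= -\,\frac{G\left[\rho + P/c^{2}\right]
                            \left[m + 4\pi r^{3}P/c^{2}\right]}
                          {r^{2}\left[1 - 2Gm/(rc^{2})\right]}.
\end{align}
```

Integrating these equations outward until the pressure drops to zero yields a mass and radius for each candidate EoS, which is why a measured radius, or a trustworthy stand-in for it, constrains the EoS so directly.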
So the researchers tested whether they could simplify the effort by substituting another measurement for the neutron star's radius. They turned to what scientists call the peak spectral frequency, or f2, of the gravitational waves — ripples in space-time — that are emitted when neutron stars merge into one larger neutron star.
The glob of dense star stuff that remains after such a collision will spew out massive gravitational waves as it moves back and forth while rotating at breakneck speed. The signal from these waves can be picked up by the hypersensitive instruments of a gravitational wave observatory like the Laser Interferometer Gravitational-Wave Observatory (LIGO).
"At least in principle, the peak spectral frequency can be calculated from the gravitational wave signal emitted by the wobbling remnant of two merged neutron stars," Elias Most, an astrophysicist at the Institute for Advanced Study in New Jersey and co-author on the new research, said in a statement.
Until now, scientists assumed f2 could stand in for a neutron star's radius because the two values are often linked to each other. But that is not always the case, the new research determined. Instead, to make the substitution work, scientists must incorporate a second value related to the neutron star's mass and radius.
The researchers hope that this determination will help scientists shed light on a theory that the neutrons in the cores of these stars break down into even smaller subatomic particles, called quarks.
The research is described in a paper published in July in The Astrophysical Journal Letters.
Bill Edwards, a retired Army Colonel and Iraq War veteran, is President of Federal and Public Safety at Building Intelligence Inc.
Hospitals are one of our society's most vulnerable and important institutions, and it is essential that security is taken seriously. Health care providers, support staff and patients should feel safe when dealing with already stressful health issues.
Evolving threat environments, however, require close attention and a deeper understanding of how a comprehensive, technology-supported, well-developed approach supports efforts to lower risk and create a more secure and safe environment. Recent attacks in Little Rock, Dallas and Tulsa, for example, are just three of the events highlighting why hospitals need to think more holistically about security program development and operations, especially “trusted access management” and “physical security screening.”
All too often, trusted access management and physical-security screening are absent in our health care facilities. When attempted, they fall far short of what is possible with the technology and known processes available today. Based on recent events and historical lessons, we have an opportunity to streamline the process in a way that makes sense for hospital visitors while meeting operational requirements for security and safety.
The Recent Evolution Of Security
It wasn’t too long ago that society struggled with the same issues in other public-facing facilities and businesses, such as airports, sports arenas, concerts and schools. Unfortunately, this is now the reality for hospitals as well, and it is what the threat environment demands.
Take, for example, the cybersecurity struggles as the world moved to digital platforms. This “cyber awakening” called for increased cyberdefense. Interest in that course of action grew, and both public and private institutions came together to provide risk-mitigation options.
The evolution of commercial off-the-shelf drones is another example of events in society reshaping the way we view the threat landscape. We must actively watch and learn from what is developing globally, from both a human and a technological perspective, and from how those evolutions and maturity models are shaping the security environment.
Proactive security planning and execution is a constant and consistent rhythm of action and reaction. Stagnant programs that fail to evolve often find themselves at a disadvantage in a rapidly changing environment. Remember the adage, “There is nothing constant but change.” Security programs require constant attention and technologies need to be understood.
We should never consider security operations static, nor should our approach to security technology design and employment remain in a legacy mindset.
Clearly, technologies and processes used and tested in other similar public-facing environments can prove helpful, especially in a time when information flows in seconds and the severity of security events takes on a life of its own. Few public spaces are more important than the venues that bring people together, and hospitals and health care facilities are chief among them.
These campuses have grown over time, however, and require a different approach and thoughtful budget. How do we get started?
1. It’s important to have plans and operational procedures in place to address a myriad of situations.
2. Leveraging legacy technology equipment is possible, but understanding how video surveillance and access control are employed is critical to understanding how integration can take shape.
For health care facilities, the threat environment has evolved beyond a camera and access control system. In today’s hospitals and health care facilities, trusted access management and physical security screening have become the third and fourth legs of the four-pillar technology approach: video surveillance system, access control, trusted access management and physical security screening.
Conducting a technology audit is not only prudent but should be considered a priority. The public demands that businesses take their “duty of care” responsibility seriously and budget properly for operational success. In a time when emerging technology and proven operational processes can change public security and safety simultaneously, pay attention.
The current conditions are telling us that we cannot sit idly in the face of impending danger. Just as doctors are experts at removing a potential health threat, we must strive to remove security threats from the health care environment. Here is a simple framework to get started:
1. Have a third-party security professional conduct a threat, vulnerability and risk assessment (TVRA) to better understand your environment and to “see yourself” from a comprehensive security and safety posture.
2. Conduct an audit of your technology platforms and discover what is possible from an upgrade and integration perspective. Technologies have come a long way in the last few years as it pertains to “working together.”
3. Combine a “trusted access management” platform with a security screening technology that is designed to be your first inner perimeter checkpoint. Trusted access management offers the ability to schedule, monitor and support the overall visitor experience.
Combining trusted access management with a well-designed physical screening system that is disconnected from the patient waiting area provides a level of trust for those waiting to see a doctor and creates a “sterile” area like airport terminals. Remember, trusted access management is not access control. The latter is designed for health care workers and proprietary staff while the former is designed for visitors.
Recent events have shown that health care facilities should be considering how they can update their security, and many other publicly accessible businesses have offered potential paths forward. The real question is: Will we get serious about health care security and safety? Only time will tell. Either way, health care security professionals must remain vigilant and proactive.
Scientists working on quantum computing have been making friendly wagers for years. Adán Cabello, from the University of Seville (Spain), is heading to Rome soon to collect on a decade-old bet (a fancy dinner) with a friend about this year’s Nobel laureate in physics. But four years ago, Spanish researcher Miguel Navascués lost a wager because he didn’t believe a 50-qubit quantum computer could be built before 2050. It cost him €50 worth of hamburgers. Time has favored the optimists, but quantum physics continues to face the fundamental challenge of increasing computing capacity while reducing error rates. Alejandro González Tudela, a research scientist at the Spanish National Research Council’s (CSIC) Institute for Theoretical Physics in Murcia (Spain), is working on a new approach to the problem. He is combining the novel capabilities of metamaterials (structures with unusual attributes) with the quantum properties of light. His research program has been awarded $20 million in Leonardo grant funding from the BBVA Foundation since 2014.
In conventional computing, a bit is the basic unit of information. A bit is binary in that it can only have one of two values: 0 or 1. Combinations of bits can provide computers with extraordinary capabilities, but in quantum computing, the basic unit is the quantum bit, or qubit. It’s a quantum system that can have one of two states (0 and 1), or any superposition of these states. Superposition is the ability of a quantum system to be in multiple states at the same time until it is measured. The use of qubits allows exponentially more combinations than the same number of classical bits, and therefore vastly greater computing possibilities. According to CSIC researcher Alberto Casas, “A quantum computer of 273 qubits will have more memory than there are atoms in the observable universe.”
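A short calculation makes the scale of that claim concrete (a standard textbook estimate, not a computation from the article): a general n-qubit superposition is described by 2^n complex amplitudes, and the atom count of the observable universe is commonly put at around 10^80.

```python
# Minimal sketch: a general n-qubit state is a superposition described by
# 2**n complex amplitudes, so the classical memory needed to record it
# doubles with every added qubit. The universe atom count below is a
# common order-of-magnitude estimate, not a figure from the article.

ATOMS_IN_OBSERVABLE_UNIVERSE = 10 ** 80

def n_amplitudes(n_qubits: int) -> int:
    """Number of complex amplitudes describing a general n-qubit state."""
    return 2 ** n_qubits

for n in (1, 10, 50, 273):
    print(f"{n:3d} qubits -> {n_amplitudes(n):.2e} amplitudes")

# Casas's comparison checks out: 2**273 exceeds the estimated atom count.
print(n_amplitudes(273) > ATOMS_IN_OBSERVABLE_UNIVERSE)  # True
```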
The problem is that this quantum property of superposition is elusive, and can only remain stable for a short time. The slightest environmental change (temperature, electromagnetic noise or vibrations) degrades this property and makes it impossible for quantum computers to effectively perform practical, large-scale calculations. This effect is known as quantum decoherence.
A recent study published in Nature Physics by British, American and Chinese scientists used a 30-qubit programmable superconducting processor to demonstrate that “quantum information processing applications can be tuned to interact with each other while maintaining coherence for an unprecedented duration.” Error correction is also used, but that technique entails one of the central challenges of quantum computing: significantly increasing the number of qubits.
But González is taking an innovative approach to the problem. He is using metamaterials, structures with unusual attributes, to create quantum devices that can attain more qubits without increasing error rates. “The properties of these metamaterials,” said González, “are modulated at scales below the wavelength of light to achieve rare responses like making a material invisible or focusing light beyond its limits.”
“The hypothesis,” said González, “is based on the fact that light has very good coherence [it easily preserves its quantum properties]. So the goal is to exploit the metamaterials’ very strong responses to light in order to improve fidelity.”
The idea is to take advantage of light’s capacity for maintaining its quantum properties, since it interacts very little with the environment. However, the disadvantage of using light is that it’s difficult to manipulate, says González.
González decided to use metamaterials in his research after the recent development of a network of atoms separated by very short distances made it possible to exploit the quantum behavior of light. “By placing the atoms at very short distances, they behave collectively and can have very strong interactions with light,” said González. This will enable him to use metamaterials with more coherent quantum behaviors to overcome the difficulty of manipulating light particles. The ultimate goal is to develop computer hardware that solves the problem of scalability – a quantum computer with more qubits and fewer errors.
“It’s interesting,” said González, “to explore alternative paradigms. I’m not saying that my approach will result in the breakthrough that solves the problem and becomes the definitive platform. Right now, the best quantum computing implementations use trapped ions or superconducting circuits, but there is also quantum technology based on photons. Perhaps, the big leap forward will come from something that is completely off the radar, or from a combination of solutions.” Nevertheless, González strongly feels the need to blaze new trails with projects like the one that was awarded the Leonardo grant. Alberto Casas agrees. “The future of quantum computing is unknown, but it is undoubtedly worth exploring,” he writes in his recently published book, The Quantum Revolution.
The value of quantum computing is not to solve factorization problems such as the ones used to test the systems. Nor is it to figure out logistical puzzles like the best transportation routes between cities. Besides cryptography, González says the biggest aspirations for this technology are to enable secure communications and solve “certain physics and chemistry problems. These are multi-faceted issues with many interacting elements that are hard to solve using traditional computers.”
The pharmaceutical industry is one area where quantum computing can provide an “exponential advantage” in the development of personalized therapies, says González. “Maybe new problems will be identified that could benefit from quantum computing, or new applications that we haven’t yet imagined will be developed.”
Scientists from Trinity College in Dublin (Ireland) published a paper in the Journal of Physics Communications that describes the quantum behaviors of brains, consciousness and short-term memory processes. “Quantum brain processes could explain why we can still outperform supercomputers when it comes to unforeseen circumstances, decision-making and learning new things,” said co-author Christian Kerskens, a physicist with Trinity College’s Institute of Neurosciences. According to the study, “If advanced multidisciplinary approaches validate the results of this study, it will improve the general understanding of how the brain works and lead to innovative technologies for building even more advanced quantum computers.”
Spain is an active competitor in the quantum race, not only in basic research but also in technological innovation. The Barcelona Supercomputing Center was selected by the European High Performance Computing Joint Undertaking (EuroHPC JU) to host and operate its first quantum computers. The new infrastructure will be installed and integrated with the MareNostrum 5 supercomputer, the most powerful computer in Spain and one of the most advanced in Europe. The QuantumSpain program will invest €12.5 million in this project, which is being equally co-financed by the European Union and Spain’s Secretariat for Digitization and Artificial Intelligence (SEDIA). “This new infrastructure, which will integrate quantum computing with MareNostrum 5, will enable us to advance multiple academic applications,” Mateo Valero, director of the Barcelona Supercomputing Center, said in a statement. The Barcelona facility will connect to a network of supercomputers in Germany, Czechia, France, Italy and Poland to serve the growing demand for quantum computing resources and services from European industry, and to support research in areas such as health, climate change, logistics and energy use.
Scientists at the U.S. Department of Energy's (DOE) Brookhaven National Laboratory helped measure how unpaired electrons in atoms at one end of a molecule can drive chemical reactivity on the molecule's opposite side. As described in a paper recently published in the Journal of the American Chemical Society, this work, in collaboration with Princeton University, shows how molecules containing these so-called free radicals could be used in a whole new class of reactions.
"Most reactions involving free radicals take place at the site of the unpaired electron," explained Brookhaven Lab chemist Matthew Bird, one of the co-corresponding authors on the paper. The Princeton team had become experts in using free radicals for a range of synthetic applications, such as polymer upcycling. But they've wondered whether free radicals might influence reactivity on other parts of the molecule as well, by pulling electrons away from those more distant locations.
"Our measurements show that these radicals can exert powerful 'electron-withdrawing' effects that make other parts of the molecule more reactive," Bird said.
The Princeton team demonstrated how that long-distance pull can overcome energy barriers and bring together otherwise unreactive molecules, potentially leading to a new approach to organic molecule synthesis.
Combining capabilities
The research relied on the combined resources of a Princeton-led DOE Energy Frontier Research Center (EFRC) focused on Bio-Inspired Light Escalated Chemistry (BioLEC). The collaboration brings together leading synthetic chemists with groups having advanced spectroscopic techniques for studying reactions. Its funding was recently renewed for another four years.
Robert Knowles, who led Princeton's role in this research, said, "This project is an example of how BioLEC's combined expertise enabled the team to quantify an important physical property of these radical species, that in turn allowed us to design the resulting synthetic methodology."
The Brookhaven team's major contribution is a technique called pulse radiolysis -- available only at Brookhaven and one other location in the U.S.
"We use the Laser Electron Accelerator Facility (LEAF) -- part of the Accelerator Center for Energy Research (ACER) in Brookhaven's Chemistry Division -- to generate intense high-energy electron pulses," Bird explained. "These pulses allow us to add or subtract electrons from molecules to make reactive species that might be difficult to make using other techniques, including short-lived reaction intermediates. With this technique, we can step into one part of a reaction and monitor what happens."
For the current study, the team used pulse radiolysis to generate molecules with oxygen-centered radicals, and then measured the "electron-withdrawing" effects on the other side of the molecule. They measured the electron pull by tracking how much the oxygen at the opposite side attracts protons, positively charged ions sloshing around in solution. The stronger the pull from the radical, the more acidic the solution has to be for protons to bind to the molecule, Bird explained.
The Brookhaven scientists found the acidity had to be high to enable proton capture, meaning the oxygen radical was a very strong electron withdrawing group. That was good news for the Princeton team. They then demonstrated that it's possible to exploit the "electron-withdrawing" effect of oxygen radicals by making parts of molecules that are generally inert more chemically reactive.
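A rough way to picture that acidity measurement (general acid-base chemistry, not the paper's actual analysis; the pKa values below are hypothetical) is the Henderson-Hasselbalch relation: the fraction of molecules holding a proton at a given pH depends on the gap between pH and pKa, and a strongly electron-withdrawing radical lowers the effective pKa of the distant oxygen.

```python
# Henderson-Hasselbalch sketch (illustrative; the pKa values are
# hypothetical, not measurements from the study).

def protonated_fraction(ph: float, pka: float) -> float:
    """Fraction of molecules with the proton bound at a given pH."""
    return 1.0 / (1.0 + 10 ** (ph - pka))

# Half-protonation occurs at pH = pKa, so a lower effective pKa means the
# solution must be far more acidic before protons bind:
for label, pka in (("without radical (hypothetical)", 10.0),
                   ("with radical (hypothetical)", 4.0)):
    print(f"{label}: fraction protonated at pH 7 = "
          f"{protonated_fraction(7.0, pka):.4f}")
```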
"The oxygen radical induces a transient 'polarity reversal' within the molecule -- causing electrons that normally want to remain on that distant side to move toward the radical to make the 'far' side more reactive," Bird explained.
These findings enabled a novel substitution reaction on phenol-based starting materials to make more complex phenol products.
"This is a great example of how our technique of pulse radiolysis can be applied to cutting-edge science problems," said Bird. "We were delighted to host an excellent graduate student, Nick Shin, from the Knowles group for this collaboration. We look forward to more collaborative projects in this second phase of BioLEC and seeing what new problems we can explore using pulse radiolysis."
Brookhaven Lab's role in this work and the EFRC at Princeton were funded by the DOE Office of Science (BES). Princeton received additional funding for the synthesis work from the National Institutes of Health.
Fictionary co-founder and CEO Kristina Stanley has worked in a wide variety of jobs, from manager of broadband planning at Nortel, to director of employee, safety, and guest services for an eastern British Columbia ski resort, to author of mystery novels.
But one of Stanley’s most difficult jobs was figuring out how to edit her own manuscripts while writing The Stone Mountain Mystery Series. As she told BetaKit in an interview, “it’s really, really difficult to edit a book from a story level. You’ve got thousands and thousands of elements that you have to keep track of and make them work together.”
Initially, Stanley tackled this problem using a combination of Microsoft Excel spreadsheets and graphs. But she soon realized that other authors likely faced the exact same issue, and set out to build a better way by combining her tech and writing background.
Today, Stanley’s software startup Fictionary aims to offer an alternative. Amid a wide field of solutions that help writers and editors with specific parts of the process, like spelling, grammar, style, structure, and publishing, Fictionary hones in on perhaps the most important and challenging part: producing a good story.
Fuelled by $1.8 million CAD in seed funding, Fictionary aims to help writers and editors around the world produce quality stories more quickly and affordably. With this capital, the Inverary, Ontario startup, based just north of Kingston, plans to move into non-fiction and start selling to other publishers and agencies to expand its community of users.
The startup’s all-equity round, which closed in September, was co-led by StandUp Ventures and BDC Capital’s Thrive Venture Fund, with support from The51 and a group of angels that includes Women’s Equity Lab general partner Sally Morris. For newly launched Thrive, Fictionary marks the fund’s third investment to date, after investing in Acerta and Private AI.
Stanley founded Fictionary in 2016 alongside her husband, Mathew (COO), who also previously worked at Nortel and has a background in tech, and her brother, Michael Conn, Fictionary’s former CTO, who has since left the company.
Initially, Fictionary focused solely on writers, before expanding to meet demand for a similar offering from editors. Today, Fictionary offers three subscription software products for writers and editors that range in price from $19 to $49 monthly, sells online courses, and provides a community for writers and editors to connect.
Fictionary’s software helps writers visualize their story arc by analyzing key story elements with artificial intelligence (AI) and gauging how their manuscript compares to fundamental storytelling components.
“We’re trying to help the average person who doesn’t have an ‘in’ in the publishing industry get a really good book out there, get an agent, or get a publisher,” said Stanley.
On the editor side of the equation, the company claims its offering enables editors to provide better, deeper story edits in less time, increasing the quality and profitability of editors’ services.
The writing and editing software space features a ton of players, from Grammarly to Scrivener, Novel Factory, and Canada’s Wattpad. According to Stanley, Fictionary is unique within the sector in terms of its focus on storytelling elements and its use of AI. “We’re it right now as far as, there’s an automated way to do this, and have software for it,” said Stanley.
“While there are other platforms endeavoring to address this gap in the market, there doesn’t appear to be a single player who is able to look at the writing and editing process in a comprehensive and meaningful way, which puts Fictionary at a sizeable advantage to lead the charge and expand into new markets and segments,” Michelle Scarborough, managing partner of BDC Capital’s Thrive Venture Fund, told BetaKit.
Fictionary previously secured $100,000 in grant funding from Creative BC and raised $245,000 in pre-seed funding in 2019 from a group of angels that included Shopify co-founder Scott Lake, Stephanie Andrew of Women’s Equity Lab, and FirstEditing founder and CEO JoEllen Taylor.
According to Stanley, following that pre-seed round, Fictionary reached breakeven cash flow and had to decide whether to keep going on its current track or set its sights higher.
Following some discussions with StandUp Ventures, Fictionary decided to embark on a new chapter and raise more venture capital to tackle the opportunity it sees in this space amid the rise of self-publishing. “We have a great product, we’ve got product-market fit, we’ve got a market, so let’s just go for it,” said Stanley.
“The love for the product Fictionary users articulate so regularly is rare, and indicative of the power and impact the tool brings to its customers,” said StandUp Ventures senior associate Lucas Perlman, who is joining Fictionary’s board as part of the round. “The self-publishing world has exploded, and we believe Fictionary is poised to become a de-facto part of the story writing toolkit for writers and editors around the globe.”
For her part, Scarborough said the Thrive Venture Fund sees “a sizeable opportunity [for Fictionary] in the fast-growing creator economy space—a market with many dimensions—within writing and editing, screenwriting, non-fiction, and beyond.”
To date, Fictionary has focused entirely on fiction but Stanley said the startup’s roadmap includes moving into non-fiction, where the CEO sees plenty of potential to apply its tech to helping people tell their own life stories. Fictionary also sees an opportunity to help agencies and publishers clear the slush pile of submitted manuscripts.
As it looks to build out its own community of writers and editors, Fictionary follows in the footsteps of Wattpad, which parlayed its vibrant self-publishing community of writers and readers—and the content produced by them—into a $754 million CAD acquisition last year.
“Wattpad is very inspirational for us,” said Stanley. “They are different in the sense that people write their stories in the community, where we help writers take those stories and turn them into powerful stories readers love. Their community is a great lead-in to Fictionary for writers needing to edit their stories.”
As the startup charts its growth strategy amid an uncertain economic environment, Stanley is confident that Fictionary is well-positioned to grow during this period, noting that people tend to write more when they are stressed. Back when COVID-19 first hit and everyone was cooped up, the CEO said, people began writing more, and demand for Fictionary rose. Heading into what could be a deep downturn, Stanley believes Fictionary is in a good spot given that it offers a tool that lets people pursue their passion without spending a lot of money.
What Perlman finds most exciting is the appreciation Fictionary’s customers have for the startup’s product, noting that writers “pour countless hours into their stories and writing books is an emotional and very personal thing to take on.”
“Fictionary has removed a major hurdle that stopped these creators from bringing their stories into the world,” Perlman told BetaKit. “The impact of that really comes through when you speak to their customers and see feedback from their community.”