When a Hackaday article proclaims that its subject is a book you should read, you might imagine that we would be talking of a seminal text known only by its authors’ names. Horowitz and Hill, perhaps, or maybe Kernighan and Ritchie. The kind of book from which you learn your craft, and to which you continuously return as a work of reference. Those books that you don’t sell on at the end of your university career.
So you might find it a little unexpected then that our subject here is a children’s book. Making A Transistor Radio, by [George Dobbs, G3RJV] is one of the huge series of books published in the UK under the Ladybird imprint that were a staple of British childhoods for a large part of the twentieth century. These slim volumes in a distinctive 7″ by 4.5″ (180 x 115 mm) hard cover format were published on a huge range of subjects, and contained well written and informative text paired with illustrations that often came from the foremost artists of the day. This one was published at the start of the 1970s when Ladybird books were in their heyday, and has the simple objective of taking the reader through the construction of a simple three transistor radio. It’s a book you must read not because it is a seminal work in the vein of Horowitz and Hill, but because it is the book that will have provided the first introduction to electronics for many people whose path took them from this humble start into taking the subject up as a career. Including me, as it happens: I received my copy in about 1979, and never looked back.
When you open the book, the first thing you see sets the tone, for there is a guide to soldering on the inside of the front cover. This is an optional construction method, but it is presented in a style that does not talk down to the reader. You are here to learn about electronics, not to be reminded that you are a child.
Past the title page, you are introduced to radio with a block diagram of a receiver, and then to simple circuitry with a torch (flashlight) battery and bulb as a first example. You are then launched into your first radio circuitry, first with a tuned circuit and then with the addition of a germanium point-contact diode and earpiece: a simple crystal set. One of the first illustrations shows a young boy wearing a shirt and tie, typical of the slightly idealised world of children’s books of the era. This was the 1970s; just how many boys would really have been dressed like that?
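The tuned circuit at the heart of that crystal set is a coil and variable capacitor in parallel, resonating at f = 1/(2π√(LC)). A minimal sketch of how such a circuit sweeps the medium-wave band, assuming illustrative component values rather than anything specified in the book:

```python
import math

def resonant_frequency_hz(inductance_h, capacitance_f):
    """Resonant frequency of a parallel LC tuned circuit: f = 1 / (2*pi*sqrt(L*C))."""
    return 1.0 / (2.0 * math.pi * math.sqrt(inductance_h * capacitance_f))

# Illustrative values only: a ferrite-rod coil of roughly 200 uH with a
# variable capacitor swinging from ~500 pF down to ~50 pF covers most of
# the medium-wave (AM broadcast) band.
lo = resonant_frequency_hz(200e-6, 500e-12)  # capacitor fully meshed
hi = resonant_frequency_hz(200e-6, 50e-12)   # capacitor nearly open
print(f"{lo / 1e3:.0f} kHz to {hi / 1e3:.0f} kHz")  # roughly 503 kHz to 1592 kHz
```

Turning the capacitor's vanes is what tunes the set from one station to another.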
Despite the introduction to soldering inside the cover, the signature construction method used in the book is the use of woodscrews and screwcups on a wooden baseboard. The reader is introduced to these, and the tools they might have to master, before being shown the measurements for the board. With this complete, we are ready for our first construction, the crystal set with its coil wound on a ferrite rod.
It is easy to believe these days that children are shielded from anything that might be remotely practical, for fear that they might hurt themselves. Fortunately the ethos of this book has its roots in a far more can-do era, and an action such as fracturing a ferrite rod to create the 3″ (75mm) length required is taken in its stride. Again, the reader is not talked down to, being introduced to all the useful things you need to know if you are to maintain an interest in radio. Few other children’s books deal with the topic of standard wire gauges.
Once constructed, the crystal set and its associated aerial (antenna) and earth would have given the 1970s child an instant result, as over most of the more populous parts of the British mainland they would have easily received the strongest AM signal, BBC Radio 2. A crystal set is hardly selective, so it’s quite likely that no matter where it was tuned it would still pick up Radio 2. Still, the sense of achievement at having pulled a signal out of thin air would have been very strong. As an aside, the book takes a brief diversion into home-made radios as created by WW2 prisoners of war with a detector made from a piece of coke.
The book then adds amplification to the crystal set in a series of stages which culminate in driving a small loudspeaker. This section is more than simply the stages of amplifier construction though, because while it takes the reader through those steps it is also a very basic primer on electronic components and transistor circuits. The amplifier is a very old-fashioned, single-ended design with an output transformer. The transistors in question are the now-archaic germanium PNP devices that had probably already been superseded by the early 1970s, but the principles of biasing and transistor circuitry are universal to all bipolar circuits. And the introduction to resistors with the resistor colour code is something that stays with a young future electronic engineer throughout their career.
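That resistor colour code is simple enough to express in a few lines. A hypothetical helper, not from the book, decoding a three-band resistor (first two bands are significant digits, the third a power-of-ten multiplier):

```python
# Standard resistor colour-to-digit mapping (black=0 through white=9).
COLOURS = ["black", "brown", "red", "orange", "yellow",
           "green", "blue", "violet", "grey", "white"]

def resistor_ohms(band1, band2, multiplier):
    """Decode a three-band resistor: two significant digits, then a multiplier band."""
    digits = COLOURS.index(band1) * 10 + COLOURS.index(band2)
    return digits * 10 ** COLOURS.index(multiplier)

print(resistor_ohms("brown", "black", "red"))      # 1, 0, x100  -> 1000 ohms (1 kOhm)
print(resistor_ohms("yellow", "violet", "orange")) # 4, 7, x1000 -> 47000 ohms (47 kOhm)
```

Generations of engineers learned this mapping from books like this one long before they met it in a datasheet.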
Finally, the reader is shown a regenerative front end for their radio that replaces the crystal set. The operation of regeneration is explained, new components are introduced, and the construction is laid out. There follows a guide to using the radio, and finally a page on finishing its case with a mounting for both speaker and battery. The final receiver might not have been as good as its commercial superheterodyne equivalent, but it would have provided acceptable performance to receive most strong AM stations.
This book has only 50 pages, and of those, half are composed of pictures and diagrams. Within this meagre canvas the author manages not only to guide the reader through the construction of a working radio receiver, but also to lay the seeds of an understanding of solid state electronics. Topics such as the resistor colour code or transistor biasing are part of the early syllabus of a first-year electronic engineering course, yet here we find them presented in a children’s book in a format that a younger reader would understand. You are reading this review because my career as an electronic engineer has its roots in this book; it would be interesting to know how many other readers will tell the same story.
Making A Transistor Radio was published in 1972, and appeared as a second edition in Ladybird’s Learnabout series at the end of the decade. Some of the devices it uses may well have been out of production by the end of its print run; even in 1979 it proved difficult to source an OC44 and we had to use an AF117 instead. The book is now long out of print, so your best bet if you want to read it yourself is to do a Google search on its title for a PDF, or to scour second-hand booksellers. There is a copy on the Internet Archive, though it has some missing pages. The book’s author, [George Dobbs, G3RJV], continues to be a prolific writer and source of radio projects. As founder of the G-QRP Club, he has been very active in furthering the cause of low-power amateur radio.
It would be interesting to see how easily a contemporary version of the book could be created, with silicon transistors, Schottky signal diodes, and a polyvaricon to replace the Jackson Dilecon variable capacitor. Or perhaps an AM radio is no longer enough to capture the imagination of a child. Ladybird stopped producing children’s books in this format in 1999, though they have recently re-emerged in a humorous form aimed at adults.
If you were introduced to electronics by this book, let us know in the comments. Do you still have your radio? If there are any other similar books that made the same mark for non-Brits, we’d love to hear about them.
Any business hoping to succeed in today’s competitive landscape understands the need to put digital transformation at the top of its priority list.
Digital technology now infiltrates almost every aspect of conducting business, from how customers interact with organizations to how employees do their jobs and how internal processes work, and the global pandemic and shift towards remote working only accelerated this trend. Put simply, those which do not adapt to the digital world are highly unlikely to survive.
“Both customers and employees expect to have a contextual experience throughout their journey with the company,” says Prabhu Karunakaran [PK], founder and CEO of digital transformation firm Exterprise. “To achieve such seamless experiences in a hybrid work environment, organizations are creating more technology than ever before, and are becoming digital solution companies in the process. They are quickly realizing that they need to unite their formerly siloed information technology, business, HR, legal and marketing departments into fast-moving, cross-disciplinary digital teams.”
But for established businesses, this often means transforming from rigid legacy systems to flexible digital platforms, which can prove challenging even for Fortune 500 companies. A lack of historic investment in technology has been compounded by budget cuts during the pandemic, meaning many still rely on antiquated hardware and software installed decades previously.
This is not the only factor holding businesses back. Other common issues include organizational silos where decision-making can be split across functions; a lack of digital skills; a culture which is resistant to change; and a lack of clear strategy and communication across departments.
For many businesses, the answer is to turn to one unified platform that can bring productivity improvements through digital transformation, helping employees, customers, and assets alike. “Taking good care of these three core imperatives is critical to profitable revenues and organizational growth,” says PK. “Productive and happy employees will deliver enhanced customer experiences, resulting in more revenues. Efficient monitoring of assets including hardware, software and enterprise assets will create operational excellence.”
One such offering is ServiceNow’s NOW Platform, which deploys enterprise-scale solutions to organizations in several industries, improving efficiency for employees, customers, and assets. Its IT Service Management solution, for instance, automates core service processes, and makes it easier for those working in IT to resolve issues faster, while its IT Operations Management and Asset Management package proactively monitors the performance of assets, helping to prevent outages in the first place.
The Customer Service Management element ensures agents have the information they need to quickly resolve issues, including proactively identifying potential problems before they become apparent. Businesses can also benefit from automated responses to common customer questions, increasing the efficiency of employees. Staff working in the field can benefit from the same functionality through the Field Service Management solution.
From an employee perspective, businesses can deliver a streamlined service experience with intelligent workflows, starting with efficiently onboarding new hires. Employees themselves can find the information they need to self-service their own HR needs, while the Workplace Service Delivery solution helps the day-to-day work run more smoothly, boosting productivity. A recent IDC report suggests agile organizations retain employees at a 34% higher rate than competitors, helping them meet the demand for skills in the current war for talent.
With teams in the US, India, and Central America, Exterprise is a certified pure-play ServiceNow implementation partner, with skilled resources operating in different time zones, and clients in sectors including financial services, insurance, healthcare, telecom, and government. Exterprise helps businesses on their transformation journey, starting by determining that the organization is fully committed to the journey, and helping to provide advisory and consultative services to demonstrate the business case and identify any challenges that will need to be overcome.
“Partners are critical to ServiceNow’s growth strategy, on our journey to $16B in revenue and beyond,” said David Parsons, Senior Vice President of Global Alliances and Partner Ecosystem at ServiceNow. “Enabling great customer and employee experiences to faster time to value, we remain committed to engaging our partner ecosystem to innovate and co-create end-to-end solutions that solve today’s most pressing business challenges.”
Exterprise works closely with clients to launch the digital transformation journey, identifying automation and integration opportunities and creating a strategic roadmap for the different departments, as well as training staff on how to use the platform.
Exterprise helps in identifying key metrics around digital standards – including mean time to resolution (MTTR) for issues, customer satisfaction (CSAT), net promoter score (NPS) and first call resolution (FCR) for customer service. This ensures organizations realize the full value, both financially and in other, less tangible ways, of their investment. Exterprise is SOC2, ISO 27001 and HIPAA certified, which can give customers an added layer of reassurance.
Ongoing reviews ensure businesses benefit from and understand any new functionality to keep enhancing the employee and customer experience, with ServiceNow introducing two major version upgrades each year. This ensures customers remain at the forefront of digital transformation, turning them from digital laggards to leaders.
One example of a business that has benefited from running several business functions on the ServiceNow platform is pipeline services and equipment company T.D. Williamson. “Exterprise has proved to be an effective partner in not only implementing ServiceNow, but also in providing consultative and advisory support to help us automate core business processes,” says Greg Rice, Director, Global Applications.
PK believes those who take the step to become a digital business can expect to benefit in several ways, including enhanced customer and employee experience, lower employee attrition rates, productivity benefits through process automation, and ultimately increased – and sustainable – revenue and profit.
“The time to act is now,” concludes PK. “Those who do so can put in place the foundations that will deliver ongoing success in years to come, increasing efficiency and ensuring they have satisfied customers and employees. That’s the recipe for success.”
To find out more about how Exterprise could help your business, visit exterprise.us/servicenow/
Knowledge is an event to bring the ServiceNow community together and experience the power of the workflow. Connect with digital leaders, create new possibilities and experiences for your customers and employees, and change how your business responds to a rapidly evolving workplace.
Whatever your business is facing, let’s workflow it.
ServiceNow reported quarterly revenue above Wall Street estimates on Wednesday, boosted by a growing demand for its artificial intelligence solutions.
The rising demand for workflow automation and the company's continued efforts to expand its portfolio with new generative AI solutions helped drive growth.
"ServiceNow is already seeing our own significant productivity increases with the generative AI solutions we’re releasing to the market, which will rapidly accelerate breakthrough innovation for our customers," said CEO Bill McDermott.
The company forecast third-quarter subscription revenue in the range of $2.19 billion to $2.20 billion. Analysts on average expected $2.15 billion, according to Refinitiv data.
Separately on Wednesday, ServiceNow also announced a program along with Nvidia and Accenture, called AI Lighthouse, designed to help the development and adoption of generative AI at enterprises.
ServiceNow's overall subscription revenue for the quarter was $2.15 billion, higher than analysts' average estimate of $2.13 billion, according to Refinitiv data.
On an adjusted basis, the company earned $2.37 per share during the quarter, compared with a profit estimate of $2.05 per share, according to Refinitiv.
ServiceNow's shares were down more than 3 per cent after the bell.
The modern world of consumer tech wouldn't exist as we know it if not for the near-ubiquitous connectivity that Wi-Fi internet provides. It serves as the wireless link bridging our mobile devices and smart home appliances, enabling our streaming entertainment and connecting us to the global internet.
In his new book, Beyond Everywhere: How Wi-Fi Became the World’s Most Beloved Technology, Greg Ennis, who co-authored the proposal that became the technical basis for Wi-Fi technology before founding the Wi-Fi Alliance and serving as its VP of Technology for a quarter century, guides readers through the fascinating (and sometimes frustrating) genesis of this now-everyday technology. In the excerpt below, Ennis recounts the harrowing final days of pitching and presentations before ultimately convincing the IEEE 802.11 Wireless LAN standards committee to adopt their candidate protocol, and examines the outside influence that Bob Metcalfe — inventor of Ethernet, the standard, and founder of 3Com, the tech company — had on Wi-Fi's eventual emergence.
Excerpted from Beyond Everywhere: How Wi-Fi Became the World’s Most Beloved Technology (c) 2023 by Greg Ennis. Published by Post Hill Press. Used with permission.
With our DFWMAC foundation now chosen, the work for the IEEE committee calmed down into a deliberate process for approving the real text language for the standard. There were still some big gaps that needed to be filled in—most important being an encryption scheme—but the committee settled into a routine of developing draft versions of the MAC sections of the ultimate standard document. At the January 1994 meeting in San Jose, I was selected to be Technical Editor of the entire (MAC+PHY) standard along with Bob O’Hara, and the two of us would continue to serve as editors through the first publication of the final standard in 1997.
The first draft of the MAC sections was basically our DFWMAC specification reformatted into the IEEE template. The development of the text was a well-established process within IEEE standards committees: as Bob and I would complete a draft, the members of the committee would submit comments, and at the subsequent meeting, there would be debates and decisions on improvements to the text. There were changes made to the packet formats, and detailed algorithmic language was developed for the operations of the protocol, but by and large, the conceptual framework of DFWMAC was left intact. In fact, nearly thirty years after DFWMAC was first proposed, its core ideas continue to form the foundation for Wi-Fi.
While this text-finalization process was going on, the technology refused to stand still. Advances in both radio communications theory and circuit design meant that higher speeds might be possible beyond the 2-megabit maximum in the draft standard. Many companies within the industry were starting to look at higher speeds even before the original standard was finally formally adopted in 1997. Achieving a speed greater than 10 megabits — comparable to standard Ethernet — had become the wireless LAN industry’s Holy Grail. The challenge was to do this while staying within the FCC’s requirements — something that would require both science and art.
Faster is always better, of course, but what was driving the push for 10 megabits? What wireless applications were really going to require 10-megabit speeds? The dominant applications for wireless LANs in the 1990s were the so-called “verticals” — for example, Symbol’s installations that involved handheld barcode scanners for inventory management. Such specialized wireless networks were installed by vertically integrated system providers offering a complete service package, including hardware, software, applications, training, and support, hence the “vertical” nomenclature. While 10-megabit speeds would be nice for these vertical applications, it probably wasn’t necessary, and if the cost were to go up, such speeds wouldn’t be justifiable. So instead, it would be the so-called “horizontal” market — wireless connectivity for general purpose computers — that drove this need for speed. In particular, the predominantly Ethernet-based office automation market, with PCs connected to shared printers and file servers, was seen as requiring faster speeds than the IEEE standard’s 2 megabits.
Bob Metcalfe is famous in the computer industry for three things: Ethernet, Metcalfe’s Law, and 3Com. He co-invented Ethernet; that’s simple enough and would be grounds for his fame all by itself. Metcalfe’s Law— which, of course, is not actually a law of physics but nonetheless seems to have real explanatory power— states that the value of a communication technology is proportional to the square of the number of connected devices. This intuitively plausible “law” explains the viral snowball effect that can result from the growing popularity of a network technology. But it would be Metcalfe’s 3Com that enters into our Wi-Fi story at this moment.
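The intuition behind Metcalfe's Law is that each of n devices can connect to every other, giving n(n-1)/2 distinct links, which grows roughly as n². A minimal sketch (the function name is illustrative, not from the excerpt):

```python
def pairwise_links(n):
    """Distinct connections possible among n devices: n*(n-1)/2, growing ~n^2."""
    return n * (n - 1) // 2

# Ten devices offer 45 possible links; a hundred devices offer 4,950.
# Multiplying the device count by 10 multiplies the potential links by ~100,
# which is the "viral snowball" the law describes.
print(pairwise_links(10))   # 45
print(pairwise_links(100))  # 4950
```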
Metcalfe invented Ethernet while working at PARC, the Xerox Palo Alto Research Center. PARC played a key role in developing many of the most important technologies of today, including window-based graphic computer interfaces and laser printing, in addition to Ethernet. But Xerox is famous for “Fumbling the Future,” also the title of a 1999 book documenting how “Xerox invented, then ignored, the first personal computer,” since the innovations developed at PARC generally ended up being commercialized not by Xerox but by Apple and others. Not surprisingly, Metcalfe decided he needed a different company to take his Ethernet invention to the market, and in 1979, he formed 3Com with some partners.
This was the same year I joined Sytek, which had been founded just a couple of months prior. Like 3Com, Sytek focused on LAN products, although based on broadband cable television technology in contrast to 3Com’s Ethernet. But whereas Sytek concentrated on hardware, 3Com decided to also develop their own software supporting new LAN-based office applications for shared PC access to data files and printers. With these software products in combination with their Ethernet technology, 3Com became a dominant player in the booming office automation market during the nineties that followed the introduction of personal computers.

Bob Metcalfe was famously skeptical about wireless LANs. In the August 16, 1993, issue of InfoWorld, he wrote up his opinion in a piece entitled “Wireless computing will flop — permanently”:
This isn’t to say there won’t be any wireless computing. Wireless mobile computers will eventually be as common as today’s pipeless mobile bathrooms. Porta-potties are found on planes and boats, on construction sites, at rock concerts, and other places where it is very inconvenient to run pipes. But bathrooms are still predominantly plumbed. For more or less the same reasons, computers will stay wired.
Was his comparison of wireless to porta-potties just sour grapes? After all, this is coming from the inventor of Ethernet, the very archetype of a wired network. In any event, we were fortunate that Metcalfe was no longer involved with 3Com management in 1996 — because 3Com now enters our story as a major catalyst for the development of Wi-Fi.
3Com’s strategy for wireless LANs was naturally a subject of great interest, as whatever direction they decided to take was going to be a significant factor in the market. As the premier Ethernet company with a customer base that was accustomed to 10-megabit speeds, it was clear that they wouldn’t take any steps unless the wireless speeds increased beyond the 2 megabits of the draft IEEE standard. But might they decide to stay out of wireless completely, like Bob Metcalfe counselled, to focus on their strong market position with wired Ethernet? And if they did decide to join the wireless world, would they develop their own technology to accomplish this? Or would they partner with an existing wireless developer? The task of navigating 3Com through this twisted path would fall to a disarmingly boyish business development whiz named Jeff Abramowitz, who approached me one afternoon quite unexpectedly.
Jeff tapped me on the shoulder at an IEEE meeting. “Hey, Greg, can I talk with you for a sec?” he whispered, and we both snuck quietly out of the meeting room. “Just wondering if you have any time available to take on a new project.” He didn’t even give me a chance to respond before continuing with a smile: “10 megabits. Wireless Ethernet.” The idea of working with the foremost Ethernet company on a high-speed version of 802.11 obviously enticed me, and I quickly said, “Let’s get together next week.”
He told me that they had already made some progress towards an internally developed implementation, but that in his opinion, it was more promising for them to partner with one of the major active players. 3Com wanted to procure a complete system of wireless LAN products that they could offer to their customer base, comprising access points and plug-in adapters (“client devices”) for both laptops and desktops. There would need to be a Request for Proposal developed, which would, of course, include both technical and business requirements, and Jeff looked to me to help formulate the technical requirements. The potential partners included Symbol, Lucent, Aironet, InTalk, and Harris Semiconductor, among others, and our first task was to develop this RFP to send out to these companies.
Symbol should need no introduction, having been my client and having played a major role in the development of the DFWMAC protocol that was selected as the foundation for the 802.11 standard. Lucent may sound like a new player, but in fact, this is simply our NCR Dutch colleagues from Utrecht — including Wim, Cees, Vic, and Bruce — under a new corporate name, NCR having been first bought by AT&T and then spun off into Lucent. Aironet is similarly an old friend under a new name — back at the start of our story, we saw that the very first wireless LAN product approved by the FCC was from a Canadian company called Telesystems, which eventually was merged into Telxon, with Aironet then being the result of a 1994 spinoff focusing on the wireless LAN business. And in another sign of the small-world nature of the wireless LAN industry at this time, my DFWMAC co-author, Phil Belanger, had moved from Xircom to Aironet in early 1996.
The two companies here who are truly new to our story are InTalk and Harris. InTalk was a small startup founded in 1996 in Cambridge, England (and then subsequently acquired by Nokia), whose engineers were significant contributors to the development of the final text within the 802.11 standard. Harris Corporation was a major defense contractor headquartered in Melbourne, Florida, who leveraged their radio system design experience into an early wireless LAN chip development project. Since they were focused on being a chip supplier rather than an equipment manufacturer, we didn’t expect them to submit their own proposal, but it was likely that other responders would incorporate their chips, so we certainly viewed them as an important player.
Over the first couple of months in 1997, Jeff and I worked up a Request for Proposal for 3Com to send out, along with a 3Com engineer named David Fisher, and by March we were able to provide the final version to various candidate partners. Given 3Com’s position in the general LAN market, the level of interest was high, and we indeed got a good set of proposals back from the companies we expected, including Symbol, Lucent, InTalk, and Aironet. These companies, along with Harris, quickly became our focus, and we began a process of intense engagement with all of them over the next several months, building relationships in the process that a year later would ultimately lead to the formation of the Wi-Fi Alliance.
Bob Metcalfe’s wireless skepticism had been soundly rejected by the very company he founded, with 3Com instead adopting the mantle of wireless evangelism. And Wireless Ethernet, soon to be christened Wi-Fi, was destined to outshine its wired LAN ancestor.
Explore the impact and implementation of FRTB within financial institutions from the perspectives of model risk management, capital requirements and data management.
This interactive virtual event offers participants a technical and detailed understanding of the continuing journey of FRTB implementation and its associated challenges. Led by subject matter expert and faculty member Thomas Obitz, this event will enable participants to connect and discuss best practice approaches on how to manage FRTB within their organisation to enhance capability and efficiency.
Dedicated sessions exploring key components and considerations of the SA and the sensitivities-based approach, emerging risk factors from the interbank offered rates transition impact, the IMA and the trading book/banking book boundary under FRTB will support delegates in their ability to apply FRTB principles in their own institutions.
Flexible pricing options:
Early-bird rate: book in advance and save $200
3-for-2 group rate: book three delegates for the price of two and save more than $2,000
Season tickets: book a team of 10 or more and save up to 50%