Download M2150-728 exam questions that showed up in the actual test today

There are many reviews of killexams.com on the web suggesting that it is a legitimate source of genuine IBM Information Management Data Security & Privacy Sales Mastery v1 questions and answers. Practically all candidates pass their exams by studying material that contains actual test questions and answers. Memorizing and practicing M2150-728 practice questions is enough to pass with good grades.

Exam Code: M2150-728 Practice test 2022 by Killexams.com team
IBM Information Management Data Security & Privacy Sales Mastery v1
Best IT Certifications for 2019

Getting certified is a surefire way to advance your career in the IT industry. Whether you work for an enterprise, a small business, the government, a healthcare organization or any other place that employs IT professionals, your best bet for career advancement is to validate your skills and knowledge through a carefully chosen combination of certifications.

Many in the IT industry will tell you that certifications are vital for moving forward, but what exactly can a certification in the IT field do for you? CompTIA, a major technology industry association that offers its own list of certifications, breaks down some of the major reasons for pursuing certifications:

  • Assess the skills and knowledge of a candidate
  • Save time and resources to evaluate candidates
  • Confirm knowledge gains
  • Support professional development
  • Meet the need for professional training

But certifications can get expensive. Factor in study materials, training and classes, test fees and the time that you devote to the whole experience; it all adds up. If you consider certifications as an investment in your career and your future, then wouldn’t you want to work to obtain those that will benefit you the most?

The right IT certification for career success

Below you’ll find certification guides that identify the best five credentials in various areas of IT, including security, storage, project management, cloud computing and computer forensics.

Best big data certifications 

The growing field of big data, analytics and business intelligence is all the rage these days, and the number of certifications is ticking up accordingly. What’s more, IT professionals with big data and related certs are growing in demand. Read more about the best big data certifications.

Best business continuity and disaster recovery certifications

Business continuity and disaster recovery certifications are seeing a healthy uptrend as new cloud-based tools proliferate in the market. While BCP and DRP have always been important, they’re becoming more critical, and IT certifications are following suit. Check out our best picks for business continuity and disaster recovery certifications.

Best cloud IT certifications

IT pros with solid cloud computing skills continue to be in high demand as more companies adopt cloud technologies. With new cloud credentials coming out each year, the choices are multiplying. Here’s an updated list of the best cloud certifications.

Best computer forensics certifications

As hacking and computer crime continue to grow, the demand for qualified computer forensics professionals is also on the rise. From over two dozen computer forensics certifications, we list the five best options for 2019.

Best computer hardware certifications 

Computer hardware certs are a great entry point for IT pros interested in becoming computer technicians. These top credentials help you demonstrate your knowledge and skills in maintaining computers, servers, peripherals and more. Read more about the best computer hardware certifications.

Best computer networking certifications

IT pros skilled in the many areas of networking remain in high demand in today’s job market. If you’re serious about a career in networking, consider one or more of these certs to set yourself apart from the competition. Read more about the best computer networking certifications.

Best data center certifications 

With emerging software-defined data center (SDDC) technologies, many of today’s data center certifications now require additional skills, including virtualization and cloud computing. Here are the best options for data center certifications for 2019.

Best database certifications 

Knowledgeable and savvy database professionals are always in demand. Here are the best certifications for DBAs, database developers, data analysts and architects, business intelligence and data warehousing specialists, and other data professionals. Check out our best picks among database certifications.

Best enterprise architect certifications 

The demand for highly skilled IT professionals who understand not only technology but also the business and enterprise architecture is on the rise. This is where enterprise IT architect credentials can help you get to the pinnacle of the IT certification pyramid. Read more about the best enterprise architect certifications.

Best healthcare IT certifications 

As global healthcare spending continues to grow, more IT pros with an understanding of healthcare’s unique technology and compliance requirements are needed. Today, there are various healthcare IT certifications to choose from – here are the top five.

Best help desk certifications 

Help desk and tech support positions remain popular points of entry for IT pros. To start or advance your tech support career and make yourself stand out, validate your knowledge and skills through one of these leading help desk certifications.

Best information security certifications

Infosec professionals serious about advancing their careers should consider these top security certifications. From entry-level choices to advanced credentials, these certifications will get you on your way to becoming a leader in the field of information security. Check out our top picks in infosec certifications.

Best IT governance certifications 

Organizations that must meet the rigors of compliance are always in need of experienced IT pros who can see the big picture and understand risks involved with technology. This means certified IT governance professionals are in high demand for 2019. Read more about the best IT governance certifications.

Best IT trainer certifications 

Becoming an IT trainer is not an easy path, but the destination is full of opportunities. IT instructors need to stay on top of the technologies they’re teaching, know how to effectively share their knowledge, and engage with their students. Read more about the best IT trainer certifications.

Best Linux certifications

Linux continues to have a strong hold on the enterprise, especially in the web server and supercomputer space, and IT pros interested in validating their Linux skills have a lot of certification options. Here are the best five Linux certifications for 2019.

Best mobile app development certifications 

Developers and IT pros interested in mobility and app development can get their resumes noticed with one or more mobile app development certifications. With options such as Microsoft, Android and Java certifications, these credentials are a smart move. Check out the best picks for mobile app development certifications.

Best mobility certifications

Mobile technologies are experiencing unprecedented growth, with no slowdown in sight. Mobility is truly taking IT by storm, forcing management to rethink everything from big-picture infrastructure design to network integration. Mobile certifications are following suit. Read more about the best certifications for mobility.

Best programming certifications 

When considering computer programming certifications, make sure to evaluate not just the programming languages but also the development platforms you’ll be working with. IT pros interested in computer programming have no shortage of certification options. Check out our best picks for programming certifications.

Best project management certifications

Project management certs continue to rate near the top of IT certification lists. Whether you’re interested in becoming a project manager or want to add project management to the list of your skills, these certifications can help you achieve your goals. Read more about the best project management certifications to consider. 

Best storage certifications

Skilled storage professionals continue to be in high demand in today’s workplace. Those looking to prove they have the chops have over 70 different storage certifications to choose from. We narrow down this list to the top five choices.

Best system administrator certifications 

When it comes to managing modern servers, there’s a long list of tools and technologies SysAdmins need to master (and an equally long list of certs to validate your knowledge and skills). Here are the best system administrator certifications for 2019.

Best telecommunications certifications

If you’re interested in telecommunications, you have several certifications to choose from, including ones that cover telecom networking and various kinds of streaming voice and video platforms for unified communications. Read more about the best telecommunications certifications.

Best unified communications certifications 

Unified communications platforms bring multiple applications together, such as email, instant messaging, IP telephony, audio and video conferencing. In today’s globally dispersed world, UC certifications are growing in demand. Check out our best picks for unified communications certifications.

Best virtualization certifications 

Virtualization continues to rank near the top of every best IT certification list. The increased popularity of virtualization technologies translates to a high demand for skilled and certified virtualization professionals. Read more about the best virtualization certifications.

Best VoIP and telephony certifications 

As traditional PBXs and PSTNs are being replaced by VoIP and IP-based PBXs, the telephony space is changing, and so are the credentials for IT professionals. Here are five of the best VoIP and telephony certifications to invest in this year.

Best web certifications 

The web design and development fields have undergone many changes over the years, and web certifications have evolved alongside the technologies. IT pros interested in validating their web skills should consider these top five certifications.

Best wireless networking certifications

The explosion of mobile devices and wireless networks in the enterprise spells a high demand for qualified IT professionals specializing in wireless technologies. Here are our top five wireless certification picks for 2019.

Source: https://www.businessnewsdaily.com/10953-best-it-certifications.html

CompTIA Certification Guide: Overview and Career Paths

Headquartered near Chicago, CompTIA is a nonprofit trade association made up of more than 2,000 member organizations and 3,000 business partners. Although the organization focuses on educating and certifying IT professionals, CompTIA also figures prominently in philanthropy and public policy advocacy.

CompTIA certification program overview

CompTIA’s vendor-neutral certification program is one of the best recognized in the IT industry. Since CompTIA developed its A+ credential in 1993, it has issued more than two million certifications.

In early 2018, CompTIA introduced its CompTIA Infrastructure Career Pathway. While you’ll still see the same familiar certifications that form the bedrock of the CompTIA certification portfolio, this new career pathway program more closely aligns CompTIA certifications to the real-world skills that IT professionals need to ensure success when managing and supporting IT infrastructures.

CompTIA certifications are grouped by skill set. Currently, CompTIA certs fall into four areas: Core, Infrastructure, Cybersecurity and Additional Professional certifications.

  • Core Certifications: Designed to build core foundational skills, CompTIA offers four Core certifications: IT Fundamentals+ (a pre-career certification focused on IT foundation framework), CompTIA A+ (focused on user support and device connectivity), CompTIA Network+ (targeting core system connections with endpoint devices), and CompTIA Security+ (focused on entry level cybersecurity skills).
     
  • Infrastructure Certifications: Designed to complement the Network+ credential, you’ll find three Infrastructure certifications: CompTIA Server+ (focused on issues related to server support and administration), CompTIA Cloud+ (covering hybrid cloud, virtual system administration and deploying network storage resources), and CompTIA Linux+ (focused on Linux operating system administration and management).
     
  • Cybersecurity Certifications: CompTIA offers three cybersecurity credentials: CompTIA CySA+ (CySA stands for Cyber Security Analyst, and targets IT security behavioral analysts), CASP+ (CompTIA Advanced Security Practitioner; focuses on professionals who design and implement security solutions), and the CompTIA PenTest+ (Penetration testing, targets professionals who conduct penetration and vulnerability testing).
     
  • Additional Professional Certifications: This category includes several credentials which don’t readily fit into any of the foregoing CompTIA career paths, including: CompTIA Project+, CompTIA CTT+ and CompTIA Cloud Essentials.

CompTIA Core Certifications

CompTIA IT Fundamentals+

CompTIA IT Fundamentals+ is ideal for beginners with a basic understanding of PC functionality and compatibility as well as familiarity with technology topics, such as hardware basics, software installation, security risks and prevention, and basic networking. It’s also ideal as a career planning or development tool for individuals beginning their IT careers or those seeking to make a career change. A single exam is required to earn the credential. CompTIA launched a new IT Fundamentals+ exam (FC0-U61) in September 2018. This new exam focuses on computing basics, database use, software development and IT infrastructure. The English version of the prior exam (FC0-U510) retires on July 15, 2019. Exams in other languages retire on December 1, 2019.

CompTIA A+

The CompTIA A+ certification has been described as an “entry-level rite of passage for IT technicians,” and for a good reason. This certification is designed for folks seeking a career as a help desk, support, service center or networking technician. It covers PC and laptop hardware, software installation, and configuration of computer and mobile operating systems. A+ also tests a candidate’s understanding of basic networking, troubleshooting and security skills, which serve as a springboard for CompTIA networking or security certifications or those offered by other organizations.

According to CompTIA, more than one million IT professionals hold the A+ certification. The A+ is required for Dell, Intel and HP service technicians and is recognized by the U.S. Department of Defense. CompTIA released new “Core” exams for the CompTIA A+ credential on January 15, 2019. These new exams provide additional focus on operational procedure competency and baseline security topics. Candidates must pass the Core 1 (Exam 220-1001) and Core 2 (Exam 220-1002) exams. The Core 1 exam targets virtualization, cloud computing, mobile devices, hardware, networking technology and troubleshooting. The Core 2 exam focuses on installing and configuring operating systems, troubleshooting software, operational procedures and security.

CompTIA Network+

Many IT professionals start with the A+ certification. While the A+ credential is recommended, if you have the experience and don’t feel a need for the A+, you can move directly to the CompTIA Network+ certification. It’s geared toward professionals who have at least nine months of networking experience. A candidate must be familiar with networking technologies, media, topologies, security, installation and configuration, and troubleshooting of common wired and wireless network devices. The Network+ certification is recommended or required by Dell, HP and Intel, and is also an accepted entry-point certification for the Apple Consultants Network. The Network+ credential meets the ISO 17024 standard and, just like the A+, is recognized by the U.S. DoD. A single exam is required to earn the certification.

CompTIA Security+

CompTIA Security+ covers network security concepts, threats and vulnerabilities, access control, identity management, cryptography, and much more. Although CompTIA does not impose any prerequisites, the organization recommends that cert candidates obtain the Network+ credential and have at least two years of IT administration experience with a security focus. To obtain the Security+ certification, candidates must pass one exam, SY0-501.

Infrastructure Certifications

CompTIA Linux+

The CompTIA Linux+ Powered by LPI certification is aimed at Linux network administrators with at least 12 months of Linux administration experience. Such experience should include installation, package management, GNU and Unix commands, shells, scripting, security and more. The A+ and Network+ certifications are recommended as a preamble to this certification but are not mandatory. Candidates must pass two exams (LX0-103 and LX0-104) to earn this credential. The exams must be taken in order, and candidates must pass exam LX0-103 before attempting LX0-104. In 2018, CompTIA began testing a new beta exam (XK1-004). The beta exam offering ended October 22, 2018. New exams generally follow beta exams, so interested candidates should check the Linux+ web page for updates.

CompTIA Cloud+

As the cloud computing market continues to grow by leaps and bounds, the CompTIA Cloud+ certification has been keeping pace. This certification targets IT professionals with two to three years of experience in storage, networking or data center administration. A single exam, CV0-002, is required. It tests candidates’ knowledge of cloud technologies, hybrid and multicloud solutions, cloud markets, and incorporating cloud-based technology solutions into system operations.

CompTIA Server+

CompTIA Server+ aims at server administrators with 18 to 24 months of experience with server hardware and software technologies, and the A+ certification is recommended. The Server+ credential is recommended or required by HP, Intel and Lenovo for their server technicians. It is also recognized by Microsoft and the U.S. Department of Defense (DoD). A single exam, SK0-004, is required to achieve this credential.

CompTIA Cybersecurity Certifications

CompTIA Cybersecurity Analyst (CySA+)

As cybercrime increases, the requirement for highly skilled information security analysts will continue to increase as well. The Bureau of Labor Statistics (BLS) reports anticipated growth of 28 percent for information security analysts between 2016 and 2026, the fastest rate of growth for all occupations. One of the newer additions to the CompTIA certification portfolio is the Cybersecurity Analyst (CySA+) certification. The CySA+ credential is specifically designed to meet the ever-growing need for experienced, qualified information security analysts.

CySA+ credential holders are well versed in the use of system threat-detection tools, as well as the use of data and behavioral analytics to secure applications and systems from risks, threats and other vulnerabilities. CySA+ certification holders are not only able to monitor network behavior, but analyze results and create solutions to better protect against advanced persistent threats (APTs), intrusions, malware and the like.

CompTIA describes CySA+ as a bridge cert between the Security+ credential (requiring two years’ experience) and the master-level Advanced Security Practitioner Certification (CASP), which requires 10 years of experience. To earn a CySA+, candidates must pass a performance-based exam.

CompTIA Advanced Security Practitioner+ (CASP+)

While CompTIA no longer uses the “master” designation, the highly sought-after CASP+ certification is most certainly a master-level credential. Targeting practitioners, CASP is the only performance-based, hands-on certification currently available from CompTIA. This certification is designed for seasoned IT security professionals who plan, design and implement security solutions in an enterprise environment.

Although this certification doesn’t impose any explicit prerequisites, it’s not a bad idea to earn the Network+ and Security+ certifications before tackling the CASP exam. You should also have 10 years of IT administration experience plus a minimum of five years of technical security experience (thus securing this certification’s place as a “master” credential).

Booz Allen Hamilton, Network Solutions and Verizon Connect, among other companies, require CASP+ certification for certain positions. The U.S. Army and U.S. Navy also accept CASP+ as an industry-based certification required by employees and contractors who perform IT work in DoD data centers. The CASP+ certification requires that candidates pass the CAS-003 exam, which consists of 90 multiple-choice and performance-based questions.

CompTIA PenTest+

The latest addition to the CompTIA certification family is the CompTIA PenTest+. An intermediate-level credential, PenTest+ is designed to complement the CySA+. While CySA+ is defensive in nature (focusing on threat detection and response), the PenTest+ credential is offensive, focusing on using penetration testing to identify and manage network vulnerabilities across multiple spectra.

There are no mandatory prerequisites, but the Network+ and Security+ (or equivalent skills) are highly recommended, along with a minimum of two years of information security experience. Candidates pursuing the cybersecurity career path may take the PenTest+ or CySA+ credential in any order.

The exam was released in July 2018 and focuses on communicating and reporting results, analyzing data, conducting penetration testing and scanning, and planning assessments. It also assesses a candidate’s knowledge of legal and compliance requirements.

Additional Professional Certifications

CompTIA Project+

The CompTIA Project+ certification focuses exclusively on project management and is ideal for project managers who are familiar with project lifecycles from planning to completion and who can finish a project on time and under budget. Project managers interested in this certification should have at least one year of project management experience overseeing small- to medium-sized projects. The Project+ credential requires that candidates pass a multiple-choice exam, PK0-004.

CompTIA Cloud Essentials

The CompTIA Cloud Essentials certification is geared toward individuals who understand the business aspects of cloud computing and how to move from in-house to cloud storage. In addition, they should be familiar with the impacts, risks and consequences of implementing a cloud-based solution. A single exam is required to earn the credential.

CompTIA CTT+

The CompTIA Certified Technical Trainer (CTT+) certification is perfect for anyone interested in technical training. It covers instructor skills, such as preparation, presentation, communication, facilitation and evaluation, in vendor-neutral fashion. Adobe, Cisco, Dell, IBM, Microsoft and Ricoh all recommend CTT+ to their trainers and accept it in lieu of their own in-house trainer certifications.

Two exams are required for the CTT+ credential: CompTIA CTT+ Essentials (TK0-201) and either CTT+ Classroom Performance Trainer (TK0-202) or CTT+ Virtual Classroom Trainer (TK0-203).

The CTT+ Classroom Performance Trainer and CTT+ Virtual Classroom Trainer are performance-based exams. In this case, you must submit a video or recording of your classroom (or virtual classroom sessions), and complete a form that documents your training preparation, delivery and student evaluations.

In addition to certification levels, CompTIA groups its certifications into several career paths:

  • Information security
  • Network and cloud technologies
  • Hardware, services and infrastructure
  • IT management and strategy
  • Web and mobile
  • Software development
  • Training
  • Office productivity

The CompTIA Certifications page lets you pick a certification level and/or a career path and then returns a list of certifications to focus on. For example, one of the most popular career paths in IT is network administration. CompTIA’s Network and Cloud Technologies career path offers numerous certifications that can help you advance your network administration career, such as IT Fundamentals+, A+ and Network+ (Core certs), along with Cloud+ and Linux+ (Infrastructure certifications) and Cloud Essentials.

Those interested in network security (one of the fastest growing fields in IT) should consider the certifications in CompTIA’s Information Security career path. This includes all four of the Core credentials (IT Fundamentals, A+, Network+ and Security+) along with all cybersecurity certifications (CySA+, PenTest+ and CASP+).

CompTIA provides a comprehensive IT certification roadmap that encompasses certifications from CompTIA as well as a variety of other organizations, including Cisco, EC-Council, Microsoft, (ISC)2, ISACA, Mile2 and more.

Because CompTIA credentials do not focus on a single skill (such as networking or virtualization), CompTIA credential holders may find themselves in a variety of job roles depending on their experience, skill levels and areas of interest. Here are just a few of the possible careers that CompTIA credential holders may find themselves engaged in:

  • A+: Typically, A+ credential holders find work in support roles, such as support administrators, support technicians or support specialists.
     
  • Network+: Network+ professionals primarily work in network-related roles, such as network analysts, administrators or support specialists. Credential holders may also work as network engineers, field technicians or network help desk technicians.
     
  • CySA+: Professionals interested in cybersecurity, information security and risk analysis may engage in roles that include security engineers, cybersecurity analysts or specialists, threat or vulnerability analysts, or analysts for security operations centers (SOCs).
     
  • Security+: Security spans a variety of jobs, such as network, system or security administrators, security managers and security consultants.
     
  • Server+: Roles for server professionals include storage and server administrators, as well as server support or IT/server technicians.
     
  • Linux+: Linux professionals often work in roles such as Linux database administrators, network administrators or web administrators.
     
  • Cloud+/Cloud Essentials: Cloud+ credential holders typically work as cloud specialists, developers or system and network administrators. Cloud Essentials professionals tend to work in areas related to cloud technical sales or business development.
     
  • CASP+: Common roles for CASP+ credential holders include cybersecurity specialists, InfoSec specialists, information security professionals and security architects.
     
  • Project+: Project+ credential holders typically engage in project leadership roles, such as project managers, coordinators and directors, or team leads.

While the examples above are by no means exhaustive, they provide an overview of some available careers. Your career choices are limited only by your interests, imagination and determination to achieve your personal goals.

CompTIA training and resources

CompTIA provides extensive training options, including classroom training, study materials and e-learning. A wide range of CompTIA Authorized Training Provider Partners (CAPPs), such as Global Knowledge, Learning Tree International and more, operate all over the world. Classroom and online/e-learning offerings range in cost from $2,000 to $4,000, depending on the particulars. Visit the CompTIA Training page for more details.

CompTIA works with third parties to offer self-study materials (the search tool is available here). Content that has been through a vetting process is branded with the CompTIA Approved Quality Content (CAQC) logo. Other materials that allow you to study at your own pace, such as audio segments, lesson activities and additional resources, are available through the CompTIA Marketplace.

Finally, every CompTIA A+, Linux+, Network+, Server+, Security+ and IT Fundamentals+ certification candidate should check out CertMaster, CompTIA’s online test prep tool. CertMaster helps you determine which topics you know well and which you need to brush up on, and suggests training to help you fill in the gaps.

Source: https://www.businessnewsdaily.com/10718-comptia-certification-guide.html

Successful AI Examples in Higher Education That Can Inspire Our Future

Over the past few years, news of the success of data-fed virtual teaching assistants and smart enrollment counselor chatbots has had the higher education world abuzz with the possibilities inherent in using artificial intelligence on campus. 

Colleges and universities hope AI will help them offload time-intensive administrative and academic tasks, make IT processes more efficient, boost enrollment in a climate of decline and deliver a better learning experience for students. On some campuses, these improvements are already taking place. 

Scaling up AI deployment at universities will take time due to the costs involved, and some faculty members may be resistant to AI on campus because they worry it will put them out of their jobs.

The best way to convince potential stakeholders of the need for AI is to “opt for a problem-first approach,” suggests the Education Advisory Board, an education enrollment services and research company. 

“Market machine learning as a solution to strategic imperatives rather than just another flashy technology gimmick,” the company adds.

Another way to convince stakeholders is to highlight stories of successful AI deployments that also demonstrate tangible benefits. Let’s look at a few of those. 

MORE FROM EDTECH: AI and smart campuses are among higher ed tech to watch in 2020.

How Georgia Tech Used AI to Unburden Harried Teaching Assistants

At the Georgia Institute of Technology, many of the students in a master’s-level AI class were unaware that one of their teaching assistants, Jill Watson, wasn’t human. (This was despite the clue in her name, which refers to IBM’s Watson.) 

The class’s approximately 300 students posted about 10,000 messages a semester to an online message board, a volume nearly impossible for a regular assistant to handle, according to The Wall Street Journal. The class’s professor, Ashok Goel, tells Slate that while “the number of questions increases if you have more students … the number of different questions doesn’t really go up." So, he and his team created a system that could respond to those queries that were consistently repeated, and released Jill onto the message board. They populated Jill’s memory with tens of thousands of questions (and their answers) from past semesters. 

Not only did most students not realize Jill was virtual, she was also among the most effective teaching assistants the class had seen, answering questions with a 97 percent success rate, according to Slate. Jill had “learned to parse the context of queries and reply to them accurately.” 
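
The general mechanism behind such an assistant, matching a new question against a bank of previously answered ones and replying only when the match is strong, can be sketched in a few lines. The snippet below is a deliberately simplified, hypothetical illustration of that idea, not Georgia Tech's actual Jill Watson system; the tiny question bank, the 0.75 confidence threshold and the answer() helper are all invented for the example.

```python
# Hypothetical sketch of an FAQ-style teaching assistant (not the real Jill Watson):
# match an incoming question against past questions and reuse the stored answer
# only when the match is confident; otherwise defer to a human assistant.
from difflib import SequenceMatcher

past_qa = {  # invented examples standing in for past-semester Q&A data
    "when is project 1 due": "Project 1 is due at the end of week 3.",
    "can i use java for the assignments": "Yes, Java and Python are both accepted.",
    "where do i submit my homework": "Submit through the course's online portal.",
}

def answer(question, threshold=0.75):
    """Return a stored answer if some past question matches closely enough."""
    question = question.lower().strip("?! .")
    best_match, best_score = None, 0.0
    for past_question in past_qa:
        score = SequenceMatcher(None, question, past_question).ratio()
        if score > best_score:
            best_match, best_score = past_question, score
    if best_score >= threshold:
        return past_qa[best_match]
    return None  # low confidence: leave this one for a human teaching assistant

print(answer("When is Project 1 due?"))                     # reuses a stored answer
print(answer("Can you motivate me to finish the course?"))  # None -> route to a human
```

Questions that fall below the threshold simply go unanswered by the software, which mirrors the division of labor described here: the program handles the consistently repeated queries, and the human assistants handle everything else.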

Jill was one of nine teaching assistants for the course, and her success didn’t mean all the assistants would lose their jobs. She couldn’t answer all of the questions — but more important, she couldn’t motivate students or help them with coursework. What Jill did was free up the human teaching assistants to do more meaningful work.

“Where humans cannot go, Jill will go. And what humans do not want to do, Jill can automate,” Goel tells DOGO News.

MORE FROM EDTECH: Universities use AI chatbots to improve student services.

AI Freezes Summer Melt at Georgia State

In 2016, Georgia State University introduced an AI chatbot, Pounce, that reduced “summer melt” by 22 percent, which meant 324 additional students showed up for the first day of fall classes. “Summer melt” occurs when students who enroll in the spring drop out by the time school begins in the fall. Georgia State’s freshman gains came specifically from those students who had access to the chatbot in a randomized control trial, said the university in a statement.

How did this happen? Through smart text messaging.

The university already knew the advantages of communicating with students via text messages. It also was aware that its existing staff couldn’t possibly be burdened with texting answers to thousands of student queries, according to Campus Technology. It decided to partner with Boston-based AdmitHub, an education technology company that works on conversational AI technology powered by human expertise.

Almost 60 percent of Georgia State’s students are from low-income backgrounds, and many of them are the first in their families to attend college, so they need individual attention as well as financial aid, according to The Chronicle of Higher Education. Feeling confused by various required forms and not knowing which campus offices to go to for specific queries are among the reasons they don’t make it to the first day of classes in the fall. 

AdmitHub worked with Georgia State’s admissions and IT teams to identify these and other common obstacles to enrollment that students face, including financial aid applications, placement exams and class registration. Information and answers related to all of these subjects were fed into Pounce, and students could ask Pounce questions 24/7 via text messages on their smart devices.

In 2016, during the first summer of implementation in the randomized control trial, Pounce delivered more than 200,000 answers to questions asked by incoming freshmen. 

“Every interaction was tailored to the specific student’s enrollment task,” says Scott Burke, assistant vice president of undergraduate admissions at Georgia State, on the university’s website. “We would have had to hire 10 full-time staff members to handle that volume of messaging without Pounce.”

The university has enhanced and continued to use Pounce. It has also expanded the chatbot’s role to other student success initiatives. 

MORE FROM EDTECH: Emotionally intelligent AI advances personalization on campus.

Tailored Instruction Meets Students’ Needs with AI

Universities aren’t just seeing declines in enrollment; they are also dealing with high dropout rates. Today’s college students need learning to be more engaging and personalized. Technology, especially AI, can help with both those issues. AI, fed with and trained by Big Data, can deliver a personalized learning experience, writes AI expert Lasse Rouhiainen in the Harvard Business Review. Professors can gain unique insights into the ways different students learn and provide suggestions on how to customize their teaching methods to their individual needs, notes Rouhiainen.

Further down the road, an AI-powered machine might even be able to read the expressions on students' faces to tell whether they are having trouble understanding lessons, according to Forbes

Meanwhile, AI is already making learning more engaging on several campuses. 

IBM Research and Rensselaer Polytechnic Institute have partnered on a new approach to help students learn Mandarin. They pair an AI-powered assistant with an immersive classroom environment that makes students feel as though they are in a restaurant in China — or in a garden or a tai chi class — where they can practice speaking Mandarin with an AI chat agent.

IBM and Rensselaer call the classroom the Cognitive Immersive Room, and it was developed at the Cognitive and Immersive Systems Lab, a research collaboration between the two entities.

Challenges in Scaling Up the Use of AI and Training IT Staff

While such discrete university AI projects have been successful, they also demonstrate that such initiatives are still very much driven by human intelligence. 

For instance, Georgia Tech’s Jill, in her earliest version, was fed more than 40,000 posts from online discussion forums to enable her to answer queries and converse with students. Similarly, Georgia State’s success with Pounce didn’t come easily, notes the Education Advisory Board. 

“In addition to the cost of the chatbot’s development, Georgia State’s ten-person team of admissions counselors spent months teaching Pounce how to respond accurately to students’ questions, another task added to a demanding workload,” according to EAB.

Aside from the amount of work this takes, it also means these AI tools will only be as good as the data they’re fed. While for small projects this may be feasible to do, the challenge becomes enormous when rolling out campuswide AI tools that need to use massive amounts of student and institutional data. An AI tool to answer any and all student questions needs a “heavy lift” database, notes Timothy M. Renick, vice president for enrollment management and student success at Georgia State, in The Chronicle of Higher Education.

Campuses where AI is being used in nonacademic areas face other challenges. Their experiences suggest that managing and using volumes of data requires staff beyond IT teams to be trained to use data and AI tools.

For instance, the University of Iowa has connected many campus buildings to computer systems that use AI to monitor them for energy efficiency and any problems. That means staff at these facilities need more than mechanical skills; at the very least, they will need to become effective at incorporating computers and data into their workflows in ways they aren’t today. This means they either need to acquire IT skills or the university IT department needs to offer more support for these teams.

“There’s going to be a skills gap that we’re all thinking about,” Don Guckert, vice president for facilities management at the University of Iowa, tells The Chronicle of Higher Education.

Source: Shailaja Neelakantan, https://edtechmagazine.com/higher/article/2020/01/successful-ai-examples-higher-education-can-inspire-our-future

Did the Universe Just Happen?
By Robert Wright



The Atlantic Monthly | April 1988
 

I. Flying Solo


Ed Fredkin is scanning the visual field systematically. He checks the instrument panel regularly. He is cool, collected, in control. He is the optimally efficient pilot.

The plane is a Cessna Stationair Six—a six-passenger single-engine amphibious plane, the kind with the wheels recessed in pontoons. Fredkin bought it not long ago and is still working out a few kinks; right now he is taking it for a spin above the British Virgin Islands after some minor mechanical work.

He points down at several brown-green masses of land, embedded in a turquoise sea so clear that the shadows of yachts are distinctly visible on its sandy bottom. He singles out a small island with a good-sized villa and a swimming pool, and explains that the compound, and the island as well, belong to "the guy that owns Boy George"—the rock star's agent, or manager, or something.

I remark, loudly enough to overcome the engine noise, "It's nice."

Yes, Fredkin says, it's nice. He adds, "It's not as nice as my island."

He's joking, I guess, but he's right. Ed Fredkin's island, which soon comes into view, is bigger and prettier. It is about 125 acres, and the hill that constitutes its bulk is a deep green—a mixture of reeds and cacti, sea grape and turpentine trees, manchineel and frangipani. Its beaches range from prosaic to sublime, and the coral in the waters just offshore attracts little and big fish whose colors look as if they were coordinated by Alexander Julian. On the island's west side are immense rocks, suitable for careful climbing, and on the east side are a bar and restaurant and a modest hotel, which consists of three clapboard buildings, each with a few rooms. Between east and west is Fredkin's secluded island villa. All told, Moskito Island—or Drake's Anchorage, as the brochures call it—is a nice place for Fredkin to spend the few weeks of each year when he is not up in the Boston area tending his various other businesses.

In addition to being a self-made millionaire, Fredkin is a self-made intellectual. Twenty years ago, at the age of thirty-four, without so much as a bachelor's degree to his name, he became a full professor at the Massachusetts Institute of Technology. Though hired to teach computer science, and then selected to guide MIT's now eminent computer-science laboratory through some of its formative years, he soon branched out into more-offbeat things. Perhaps the most idiosyncratic of the courses he has taught is one on "digital physics," in which he propounded the most idiosyncratic of his several idiosyncratic theories. This theory is the reason I've come to Fredkin's island. It is one of those things that a person has to be prepared for. The preparer has to say, "Now, this is going to sound pretty weird, and in a way it is, but in a way it's not as weird as it sounds, and you'll see this once you understand it, but that may take a while, so in the meantime don't prejudge it, and don't casually dismiss it." Ed Fredkin thinks that the universe is a computer.

Fredkin works in a twilight zone of modern science—the interface of computer science and physics. Here two concepts that traditionally have ranked among science's most fundamental—matter and energy—keep bumping into a third: information. The exact relationship among the three is a question without a clear answer, a question vague enough, and basic enough, to have inspired a wide variety of opinions. Some scientists have settled for modest and sober answers. Information, they will tell you, is just one of many forms of matter and energy; it is embodied in things like a computer's electrons and a brain's neural firings, things like newsprint and radio waves, and that is that. Others talk in grander terms, suggesting that information deserves full equality with matter and energy, that it should join them in some sort of scientific trinity, that these three things are the main ingredients of reality.

Fredkin goes further still. According to his theory of digital physics, information is more fundamental than matter and energy. He believes that atoms, electrons, and quarks consist ultimately of bits—binary units of information, like those that are the currency of computation in a personal computer or a pocket calculator. And he believes that the behavior of those bits, and thus of the entire universe, is governed by a single programming rule. This rule, Fredkin says, is something fairly simple, something vastly less arcane than the mathematical constructs that conventional physicists use to explain the dynamics of physical reality. Yet through ceaseless repetition—by tirelessly taking information it has just transformed and transforming it further—it has generated pervasive complexity. Fredkin calls this rule, with discernible reverence, "the cause and prime mover of everything."

AT THE RESTAURANT ON FREDKIN'S ISLAND THE FOOD is prepared by a large man named Brutus and is humbly submitted to diners by men and women native to nearby islands. The restaurant is open-air, ventilated by a sea breeze that is warm during the day, cool at night, and almost always moist. Between the diners and the ocean is a knee-high stone wall, against which waves lap rhythmically. Beyond are other islands and a horizon typically blanketed by cottony clouds. Above is a thatched ceiling, concealing, if the truth be told, a sheet of corrugated steel. It is lunchtime now, and Fredkin is sitting in a cane-and-wicker chair across the table from me, wearing a light cotton sport shirt and gray swimming trunks. He was out trying to windsurf this morning, and he enjoyed only the marginal success that one would predict on the basis of his appearance. He is fairly tall and very thin, and has a softness about him—not effeminacy, but a gentleness of expression and manner—and the complexion of a scholar; even after a week on the island, his face doesn't vary much from white, except for his nose, which is red. The plastic frames of his glasses, in a modified aviator configuration, surround narrow eyes; there are times—early in the morning or right after a nap—when his eyes barely qualify as slits. His hair, perennially semi-combed, is black with a little gray.

Fredkin is a pleasant mealtime companion. He has much to say that is interesting, which is fortunate because generally he does most of the talking. He has little curiosity about other people's minds, unless their interests happen to coincide with his, which few people's do. "He's right above us," his wife, Joyce, once explained to me, holding her left hand just above her head, parallel to the ground. "Right here looking down. He's not looking down saying, 'I know more than you.' He's just going along his own way."

The food has not yet arrived, and Fredkin is passing the time by describing the world view into which his theory of digital physics fits. "There are three great philosophical questions," he begins. "What is life? What is consciousness and thinking and memory and all that? And how does the universe work?" He says that his "informational viewpoint" encompasses all three. Take life, for example. Deoxyribonucleic acid, the material of heredity, is "a good example of digitally encoded information," he says. "The information that implies what a creature or a plant is going to be is encoded; it has its representation in the DNA, right? Okay, now, there is a process that takes that information and transforms it into the creature, okay?" His point is that a mouse, for example, is "a big, complicated informational process."

Fredkin exudes rationality. His voice isn't quite as even and precise as Mr. Spock's, but it's close, and the parallels don't end there. He rarely displays emotion—except, perhaps, the slightest sign of irritation under the most trying circumstances. He has never seen a problem that didn't have a perfectly logical solution, and he believes strongly that intelligence can be mechanized without limit. More than ten years ago he founded the Fredkin Prize, a $100,000 award to be given to the creator of the first computer program that can beat a world chess champion. No one has won it yet, and Fredkin hopes to have the award raised to $1 million.

Fredkin is hardly alone in considering DNA a form of information, but this observation was less common back when he first made it. So too with many of his ideas. When his world view crystallized, a quarter of a century ago, he immediately saw dozens of large-scale implications, in fields ranging from physics to biology to psychology. A number of these have gained currency since then, and he considers this trend an ongoing substantiation of his entire outlook.

Fredkin talks some more and then recaps. "What I'm saying is that at the most basic level of complexity an information process runs what we think of as physics. At the much higher level of complexity life, DNA—you know, the biochemical functions—are controlled by a digital information process. Then, at another level, our thought processes are basically information processing." That is not to say, he stresses, that everything is best viewed as information. "It's just like there's mathematics and all these other things, but not everything is best viewed from a mathematical viewpoint. So what's being said is not that this comes along and replaces everything. It's one more avenue of modeling reality, and it happens to cover the sort of three biggest philosophical mysteries. So it sort of completes the picture."

Among the scientists who don't dismiss Fredkin's theory of digital physics out of hand is Marvin Minsky, a computer scientist and polymath at MIT, whose renown approaches cultic proportions in some circles. Minsky calls Fredkin "Einstein-like" in his ability to find deep principles through simple intellectual excursions. If it is true that most physicists think Fredkin is off the wall, Minsky told me, it is also true that "most physicists are the ones who don't invent new theories"; they go about their work with tunnel vision, never questioning the dogma of the day. When it comes to the kind of basic reformulation of thought proposed by Fredkin, "there's no point in talking to anyone but a Feynman or an Einstein or a Pauli," Minsky says. "The rest are just Republicans and Democrats." I talked with Richard Feynman, a Nobel laureate at the California Institute of Technology, before his death, in February. Feynman considered Fredkin a brilliant and consistently original, though sometimes incautious, thinker. If anyone is going to come up with a new and fruitful way of looking at physics, Feynman said, Fredkin will.

Notwithstanding their moral support, though, neither Feynman nor Minsky was ever convinced that the universe is a computer. They were endorsing Fredkin's mind, not this particular manifestation of it. When it comes to digital physics, Ed Fredkin is flying solo.

He knows that, and he regrets that his ideas continue to lack the support of his colleagues. But his self-confidence is unshaken. You see, Fredkin has had an odd childhood, and an odd education, and an odd career, all of which, he explains, have endowed him with an odd perspective, from which the essential nature of the universe happens to be clearly visible. "I feel like I'm the only person with eyes in a world where everyone's blind," he says.

II. A Finely Mottled Universe


THE PRIME MOVER OF EVERYTHING, THE SINGLE principle that governs the universe, lies somewhere within a class of computer programs known as cellular automata, according to Fredkin.

The cellular automaton was invented in the early 1950s by John von Neumann, one of the architects of computer science and a seminal thinker in several other fields. Von Neumann (who was stimulated in this and other inquiries by the ideas of the mathematician Stanislaw Ulam) saw cellular automata as a way to study reproduction abstractly, but the word cellular is not meant biologically when used in this context. It refers, rather, to adjacent spaces—cells—that together form a pattern. These days the cells typically appear on a computer screen, though von Neumann, lacking this convenience, rendered them on paper.

In some respects cellular automata resemble those splendid graphic displays produced by patriotic masses in authoritarian societies and by avid football fans at American universities. Holding up large colored cards on cue, they can collectively generate a portrait of, say, Lenin, Mao Zedong, or a University of Southern California Trojan. More impressive still, one portrait can fade out and another crystallize in no time at all. Again and again one frozen frame melts into another. It is a spectacular feat of precision and planning.

But suppose there were no planning. Suppose that instead of arranging a succession of cards to display, everyone learned a single rule for repeatedly determining which card was called for next. This rule might assume any of a number of forms. For example, in a crowd where all cards were either blue or white, each card holder could be instructed to look at his own card and the cards of his four nearest neighbors—to his front, back, left, and right—and do what the majority did during the last frame. (This five-cell group is known as the von Neumann neighborhood.) Alternatively, each card holder could be instructed to do the opposite of what the majority did. In either event the result would be a series not of predetermined portraits but of more abstract, unpredicted patterns. If, by prior agreement, we began with a USC Trojan, its white face might dissolve into a sea of blue, as whitecaps drifted aimlessly across the stadium. Conversely, an ocean of randomness could yield islands of structure—not a Trojan, perhaps, but at least something that didn't look entirely accidental. It all depends on the original pattern of cells and the rule used to transform it incrementally.

This leaves room for abundant variety. There are many ways to define a neighborhood, and for any given neighborhood there are many possible rules, most of them more complicated than blind conformity or implacable nonconformity. Each cell may, for instance, not only count cells in the vicinity but also pay attention to which particular cells are doing what. All told, the number of possible rules is an exponential function of the number of cells in the neighborhood; the von Neumann neighborhood alone has 2^32, or around 4 billion, possible rules, and the nine-cell neighborhood that results from adding corner cells offers 2^512, or roughly 1 with 154 zeros after it, possibilities. But whatever neighborhoods, and whatever rules, are programmed into a computer, two things are always true of cellular automata: all cells use the same rule to determine future behavior by reference to the past behavior of neighbors, and all cells obey the rule simultaneously, time after time.
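
To make those figures concrete, here is a small worked check, a sketch in Python rather than anything from Fredkin's group: with binary cells, a neighborhood of n cells can be in 2^n distinct on/off configurations, and a rule must assign an outcome to every configuration, which yields 2^(2^n) possible rules.

```python
# Reproducing the rule counts quoted above for binary-state cellular automata.
# A neighborhood of n cells has 2**n on/off configurations, and a rule assigns
# "on" or "off" to each configuration, giving 2**(2**n) possible rules.
for n, name in [(5, "five-cell von Neumann neighborhood"),
                (9, "nine-cell neighborhood including corners")]:
    configurations = 2 ** n
    rules = 2 ** configurations
    print(f"{name}: 2**{configurations} = {float(rules):.3e} possible rules")

# Prints roughly:
#   five-cell von Neumann neighborhood: 2**32 = 4.295e+09 possible rules
#   nine-cell neighborhood including corners: 2**512 = 1.341e+154 possible rules
```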

In the late 1950s, shortly after becoming acquainted with cellular automata, Fredkin began playing around with rules, selecting the powerful and interesting and discarding the weak and bland. He found, for example, that any rule requiring all four of a cell's immediate neighbors to be lit up in order for the cell itself to be lit up at the next moment would not provide sustained entertainment; a single "off" cell would proliferate until darkness covered the computer screen. But equally simple rules could create great complexity. The first such rule discovered by Fredkin dictated that a cell be on if an odd number of cells in its von Neumann neighborhood had been on, and off otherwise. After "seeding" a good, powerful rule with an irregular landscape of off and on cells, Fredkin could watch rich patterns bloom, some freezing upon maturity, some eventually dissipating, others locking into a cycle of growth and decay. A colleague, after watching one of Fredkin's rules in action, suggested that he sell the program to a designer of Persian rugs.
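
To show how little machinery such a rule needs, here is a toy reconstruction in Python of the parity rule just described, not code from Fredkin or the MIT group: a cell is on at the next step exactly when an odd number of cells in its five-cell von Neumann neighborhood, itself plus its four nearest neighbors, were on. The 16-by-16 grid, the random seeding and the wrap-around edges are arbitrary choices for the demonstration.

```python
# Toy version of the parity rule: a cell is on at the next step if an odd number
# of cells in its von Neumann neighborhood (itself plus its four orthogonal
# neighbors) are on now, and off otherwise. Edges wrap around like a torus.
import random

SIZE = 16

def step(grid):
    n = len(grid)
    new = [[0] * n for _ in range(n)]
    for r in range(n):
        for c in range(n):
            count = (grid[r][c]
                     + grid[(r - 1) % n][c] + grid[(r + 1) % n][c]
                     + grid[r][(c - 1) % n] + grid[r][(c + 1) % n])
            new[r][c] = count % 2   # odd count -> on, even count -> off
    return new

def show(grid):
    print("\n".join("".join("#" if cell else "." for cell in row) for row in grid))
    print()

random.seed(0)  # seed with an irregular landscape of off and on cells
grid = [[random.randint(0, 1) for _ in range(SIZE)] for _ in range(SIZE)]
for _ in range(4):  # print a few generations and watch patterns evolve
    show(grid)
    grid = step(grid)
```

Printing a few generations is enough to see an irregular seed bloom into the kind of structured, sometimes symmetric patterns described above.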

Today new cellular-automaton rules are formulated and tested by the "information-mechanics group" founded by Fredkin at MIT's computer-science laboratory. The core of the group is an international duo of physicists, Tommaso Toffoli, of Italy, and Norman Margolus, of Canada. They differ in the degree to which they take Fredkin's theory of physics seriously, but both agree with him that there is value in exploring the relationship between computation and physics, and they have spent much time using cellular automata to simulate physical processes. In the basement of the computer-science laboratory is the CAM—the cellular automaton machine, designed by Toffoli and Margolus partly for that purpose. Its screen has 65,536 cells, each of which can assume any of four colors and can change color sixty times a second.

The CAM is an engrossing, potentially mesmerizing machine. Its four colors—the three primaries and black—intermix rapidly and intricately enough to form subtly shifting hues of almost any gradation; pretty waves of deep blue or red ebb and flow with fine fluidity and sometimes with rhythm, playing on the edge between chaos and order.

Guided by the right rule, the CAM can do a respectable imitation of pond water rippling outward circularly in deference to a descending pebble, or of bubbles forming at the bottom of a pot of boiling water, or of a snowflake blossoming from a seed of ice: step by step, a single "ice crystal" in the center of the screen unfolds into a full-fledged flake, a six-edged sheet of ice riddled symmetrically with dark pockets of mist. (It is easy to see how a cellular automaton can capture the principles thought to govern the growth of a snowflake: regions of vapor that find themselves in the vicinity of a budding snowflake freeze—unless so nearly enveloped by ice crystals that they cannot discharge enough heat to freeze.)

These exercises are fun to watch, and they give one a sense of the cellular automaton's power, but Fredkin is not particularly interested in them. After all, a snowflake is not, at the visible level, literally a cellular automaton; an ice crystal is not a single, indivisible bit of information, like the cell that portrays it. Fredkin believes that automata will more faithfully mirror reality as they are applied to its more fundamental levels and the rules needed to model the motion of molecules, atoms, electrons, and quarks are uncovered. And he believes that at the most fundamental level (whatever that turns out to be) the automaton will describe the physical world with perfect precision, because at that level the universe is a cellular automaton, in three dimensions—a crystalline lattice of interacting logic units, each one "deciding" zillions of times per second whether it will be off or on at the next point in time. The information thus produced, Fredkin says, is the fabric of reality, the stuff of which matter and energy are made. An electron, in Fredkin's universe, is nothing more than a pattern of information, and an orbiting electron is nothing more than that pattern moving. Indeed, even this motion is in some sense illusory: the bits of information that constitute the pattern never move, any more than football fans would change places to slide a USC Trojan four seats to the left. Each bit stays put and confines its activity to blinking on and off. "You see, I don't believe that there are objects like electrons and photons, and things which are themselves and nothing else," Fredkin says. "What I believe is that there's an information process, and the bits, when they're in certain configurations, behave like the thing we call the electron, or the hydrogen atom, or whatever."

THE READER MAY NOW HAVE A NUMBER OF questions that unless satisfactorily answered will lead to something approaching contempt for Fredkin's thinking. One such question concerns the way cellular automata chop space and time into little bits. Most conventional theories of physics reflect the intuition that reality is continuous—that one "point" in time is no such thing but, rather, flows seamlessly into the next, and that space, similarly, doesn't come in little chunks but is perfectly smooth. Fredkin's theory implies that both space and time have a graininess to them, and that the grains cannot be chopped up into smaller grains; that people and dogs and trees and oceans, at rock bottom, are more like mosaics than like paintings; and that time's essence is better captured by a digital watch than by a grandfather clock.

The obvious question is, Why do space and time seem continuous if they are not? The obvious answer is, The cubes of space and points of time are very, very small: time seems continuous in just the way that movies seem to move when in fact they are frames, and the illusion of spatial continuity is akin to the emergence of smooth shades from the finely mottled texture of a newspaper photograph.

The obvious answer, Fredkin says, is not the whole answer; the illusion of continuity is yet more deeply ingrained in our situation. Even if the ticks on the universal clock were, in some absolute sense, very slow, time would still seem continuous to us, since our perception, itself proceeding in the same ticks, would be no more finely grained than the processes being perceived. So too with spatial perception: Can eyes composed of the smallest units in existence perceive those units? Could any informational process sense its ultimate constituents? The point is that the basic units of time and space in Fredkin's reality don't just happen to be imperceptibly small. As long as the creatures doing the perceiving are in that reality, the units have to be imperceptibly small.

Though some may find this discreteness hard to comprehend, Fredkin finds a grainy reality more sensible than a smooth one. If reality is truly continuous, as most physicists now believe it is, then there must be quantities that cannot be expressed with a finite number of digits; the number representing the strength of an electromagnetic field, for example, could begin 5.23429847 and go on forever without falling into a pattern of repetition. That seems strange to Fredkin: wouldn't you eventually get to a point, around the hundredth, or thousandth, or millionth decimal place, where you had hit the strength of the field right on the nose? Indeed, wouldn't you expect that every physical quantity has an exactness about it? Well, you might and might not. But Fredkin does expect exactness, and in his universe he gets it.

Fredkin has an interesting way of expressing his insistence that all physical quantities be "rational." (A rational number is a number that can be expressed as a fraction—as a ratio of one integer to another. Expressed as a decimal, a rational number will either end, as 5/2 does in the form of 2.5, or repeat itself endlessly, as 1/7 does in the form of 0.142857142857142 . . .) He says he finds it hard to believe that a finite volume of space could contain an infinite amount of information. It is almost as if he viewed each parcel of space as having the digits describing it actually crammed into it. This seems an odd perspective, one that confuses the thing itself with the information it represents. But such an inversion between the realm of things and the realm of representation is common among those who work at the interface of computer science and physics. Contemplating the essence of information seems to affect the way you think.

The prospect of a discrete reality, however alien to the average person, is easier to fathom than the problem of the infinite regress, which is also raised by Fredkin's theory. The problem begins with the fact that information typically has a physical basis. Writing consists of ink; speech is composed of sound waves; even the computer's ephemeral bits and bytes are grounded in configurations of electrons. If the electrons are in turn made of information, then what is the information made of?

Asking questions like this ten or twelve times is not a good way to earn Fredkin's respect. A look of exasperation passes fleetingly over his face. "What I've tried to explain is that—and I hate to do this, because physicists are always doing this in an obnoxious way—is that the question implies you're missing a very important concept." He gives it one more try, two more tries, three, and eventually some of the fog between me and his view of the universe disappears. I begin to understand that this is a theory not just of physics but of metaphysics. When you disentangle these theories—compare the physics with other theories of physics, and the metaphysics with other ideas about metaphysics—both sound less far-fetched than when jumbled together as one. And, as a bonus, Fredkin's metaphysics leads to a kind of high-tech theology—to speculation about supreme beings and the purpose of life.

III. The Perfect Thing


EDWARD FREDKIN WAS BORN IN 1934, THE LAST OF three children in a previously prosperous family. His father, Manuel, had come to Southern California from Russia shortly after the Revolution and founded a chain of radio stores that did not survive the Great Depression. The family learned economy, and Fredkin has not forgotten it. He can reach into his pocket, pull out a tissue that should have been retired weeks ago, and, with cleaning solution, make an entire airplane windshield clear. He can take even a well-written computer program, sift through it for superfluous instructions, and edit it accordingly, reducing both its size and its running time.

Manuel was by all accounts a competitive man, and he focused his competitive energies on the two boys: Edward and his older brother, Norman. Manuel routinely challenged Ed's mastery of fact, inciting sustained arguments over, say, the distance between the moon and the earth. Norman's theory is that his father, though bright, was intellectually insecure; he seemed somehow threatened by the knowledge the boys brought home from school. Manuel's mistrust of books, experts, and all other sources of received wisdom was absorbed by Ed.

So was his competitiveness. Fredkin always considered himself the smartest kid in his class. He used to place bets with other students on test scores. This habit did not endear him to his peers, and he seems in general to have lacked the prerequisites of popularity. His sense of humor was unusual. His interests were not widely shared. His physique was not a force to be reckoned with. He recalls, "When I was young—you know, sixth, seventh grade—two kids would be choosing sides for a game of something. It could be touch football. They'd choose everybody but me, and then there'd be a fight as to whether one side would have to take me. One side would say, 'We have eight and you have seven,' and they'd say, 'That's okay.' They'd be willing to play with seven." Though exhaustive in documenting his social alienation, Fredkin concedes that he was not the only unpopular student in school. "There was a socially active subgroup, probably not a majority, maybe forty percent, who were very socially active. They went out on dates. They went to parties. They did this and they did that. The others were left out. And I was in this big left-out group. But I was in the pole position. I was really left out."

Of the hours Fredkin spent alone, a good many were devoted to courting disaster in the name of science. By wiring together scores of large, 45-volt batteries, he collected enough electricity to conjure up vivid, erratic arcs. By scraping the heads off matches and buying sulfur, saltpeter, and charcoal, he acquired a good working knowledge of pyrotechnics. He built small, minimally destructive but visually impressive bombs, and fashioned rockets out of cardboard tubing and aluminum foil. But more than bombs and rockets, it was mechanisms that captured Fredkin's attention. From an early age he was viscerally attracted to Big Ben alarm clocks, which he methodically took apart and put back together. He also picked up his father's facility with radios and household appliances. But whereas Manuel seemed to fix things without understanding the underlying science, his son was curious about first principles.

So while other kids were playing baseball or chasing girls, Ed Fredkin was taking things apart and putting them back together. Children were aloof, even cruel, but a broken clock always responded gratefully to a healing hand. "I always got along well with machines," he remembers.

After graduation from high school, in 1952, Fredkin headed for the California Institute of Technology with hopes of finding a more appreciative social environment. But students at Caltech turned out to bear a disturbing resemblance to people he had observed elsewhere. "They were smart like me," he recalls, "but they had the full spectrum and distribution of social development." Once again Fredkin found his weekends unencumbered by parties. And once again he didn't spend his free time studying. Indeed, one of the few lessons he learned is that college is different from high school: in college if you don't study, you flunk out. This he did a few months into his sophomore year. Then, following in his brother's footsteps, he joined the Air Force and learned to fly fighter planes.

IT WAS THE AIR FORCE THAT FINALLY BROUGHT Fredkin face to face with a computer. He was working for the Air Proving Ground Command, whose function was to ensure that everything from combat boots to bombers was of top quality, when the unit was given the job of testing a computerized air-defense system known as SAGE (for "semi-automatic ground environment"). To test SAGE the Air Force needed men who knew something about computers, and so in 1956 a group from the Air Proving Ground Command, including Fredkin, was sent to MIT's Lincoln Laboratory and enrolled in computer-science courses. "Everything made instant sense to me," Fredkin remembers. "I just soaked it up like a sponge."

SAGE, when ready for testing, turned out to be even more complex than anticipated—too complex to be tested by anyone but genuine experts—and the job had to be contracted out. This development, combined with bureaucratic disorder, meant that Fredkin was now a man without a function, a sort of visiting scholar at Lincoln Laboratory. "For a period of time, probably over a year, no one ever came to tell me to do anything. Well, meanwhile, down the hall they installed the latest, most modern computer in the world—IBM's biggest, most powerful computer. So I just went down and started to program it." The computer was an XD-1. It was slower and less capacious than an Apple Macintosh and was roughly the size of a large house.

When Fredkin talks about his year alone with this dinosaur, you half expect to hear violins start playing in the background. "My whole way of life was just waiting for the computer to come along," he says. "The computer was in essence just the perfect thing." It was in some respects preferable to every other conglomeration of matter he had encountered—more sophisticated and flexible than other inorganic machines, and more logical than organic ones. "See, when I write a program, if I write it correctly, it will work. If I'm dealing with a person, and I tell him something, and I tell him correctly, it may or may not work."

The XD-1, in short, was an intelligence with which Fredkin could empathize. It was the ultimate embodiment of mechanical predictability, the refuge to which as a child he had retreated from the incomprehensibly hostile world of humanity. If the universe is indeed a computer, then it could be a friendly place after all.

During the several years after his arrival at Lincoln Lab, as Fredkin was joining the first generation of hackers, he was also immersing himself in physics—finally learning, through self-instruction, the lessons he had missed by dropping out of Caltech. It is this two-track education, Fredkin says, that led him to the theory of digital physics. For a time "there was no one in the world with the same interest in physics who had the intimate experience with computers that I did. I honestly think that there was a period of many years when I was in a unique position."

The uniqueness lay not only in the fusion of physics and computer science but also in the peculiar composition of Fredkin's physics curriculum. Many physicists acquire as children the sort of kinship with mechanism that he still feels, but in most cases it is later diluted by formal education; quantum mechanics, the prevailing paradigm in contemporary physics, seems to imply that at its core reality has truly random elements and is thus inherently unpredictable. But Fredkin escaped the usual indoctrination. To this day he maintains, as did Albert Einstein, that the common interpretation of quantum mechanics is mistaken—that any seeming indeterminacy in the subatomic world reflects only our ignorance of the determining principles, not their absence. This is a critical belief, for if he is wrong and the universe is not ultimately deterministic, then it cannot be governed by a process as exacting as computation.

After leaving the Air Force, Fredkin went to work for Bolt Beranek and Newman, a consulting firm in the Boston area, now known for its work in artificial intelligence and computer networking. His supervisor at BBN, J. C. R. Licklider, says of his first encounter with Fredkin, "It was obvious to me he was very unusual and probably a genius, and the more I came to know him, the more I came to think that that was not too elevated a description." Fredkin "worked almost continuously," Licklider recalls. "It was hard to get him to go to sleep sometimes." A pattern emerged. Licklider would give Fredkin a problem to work on—say, figuring out how to get a computer to search a text in its memory for an only partially specified sequence of letters. Fredkin would retreat to his office and return twenty or thirty hours later with the solution—or, rather, a solution; he often came back with the answer to a question different from the one that Licklider had asked. Fredkin's focus was intense but undisciplined, and it tended to stray from a problem as soon as he was confident that he understood the solution in principle.

This intellectual wanderlust is one of Fredkin's most enduring and exasperating traits. Just about everyone who knows him has a way of describing it: "He doesn't really work. He sort of fiddles." "Very often he has these great ideas and then does not have the discipline to cultivate the idea." "There is a gap between the quality of the original ideas and what follows. There's an imbalance there." Fredkin is aware of his reputation. In self-parody he once brought a cartoon to a friend's attention: A beaver and another forest animal are contemplating an immense man-made dam. The beaver is saying something like, "No, I didn't actually build it. But it's based on an idea of mine."

Among the ideas that congealed in Fredkin's mind during his stay at BBN is the one that gave him his current reputation as (depending on whom you talk to) a thinker of great depth and rare insight, a source of interesting but reckless speculation, or a crackpot.

IV. Tick by Tick, Dot by Dot


THE IDEA THAT THE UNIVERSE IS A COMPUTER WAS inspired partly by the idea of the universal computer. Universal computer, a term that can accurately be applied to everything from an IBM PC to a Cray supercomputer, has a technical, rigorous definition, but here its upshot will do: a universal computer can simulate any process that can be precisely described and perform any calculation that is performable.

This broad power is ultimately grounded in something very simple: the algorithm. An algorithm is a fixed procedure for converting input into output, for taking one body of information and turning it into another. For example, a computer program that takes any number it is given, squares it, and subtracts three is an algorithm. This isn't a very powerful algorithm; by taking a 3 and turning it into a 6, it hasn't created much new information. But algorithms become more powerful with recursion. A recursive algorithm is an algorithm whose output is fed back into it as input. Thus the algorithm that turned 3 into 6, if operating recursively, would continue, turning 6 into 33, then 33 into 1,086, then 1,086 into 1,179,393, and so on.
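
Expressed in Python, the article's square-and-subtract-three example and its recursive use look like this (the number of iterations is an arbitrary choice):

```python
def rule(n):
    """The toy algorithm from the text: square the input, then subtract three."""
    return n * n - 3

# Recursion in the article's sense: feed each output back in as the next input.
value = 3
for _ in range(4):
    value = rule(value)
    print(value)   # 6, then 33, then 1086, then 1179393
```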

The power of recursive algorithms is especially apparent in the simulation of physical processes. While Fredkin was at BBN, he would use the company's Digital Equipment Corporation PDP-1 computer to simulate, say, two particles, one that was positively charged and one that was negatively charged, orbiting each other in accordance with the laws of electromagnetism. It was a pretty sight: two phosphor dots dancing, each etching a green trail that faded into yellow and then into darkness. But for Fredkin the attraction lay less in this elegant image than in its underlying logic. The program he had written took the particles' velocities and positions at one point in time, computed those variables for the next point in time, and then fed the new variables back into the algorithm to get newer variables—and so on and so on, thousands of times a second. The several steps in this algorithm, Fredkin recalls, were "very simple and very beautiful." It was in these orbiting phosphor dots that Fredkin first saw the appeal of his kind of universe—a universe that proceeds tick by tick and dot by dot, a universe in which complexity boils down to rules of elementary simplicity.
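
The loop Fredkin describes has a simple shape: take the current positions and velocities, compute the next ones, and feed them back in. The sketch below is a rough stand-in for the PDP-1 program, not a reconstruction of it; it assumes an attractive inverse-square force in arbitrary units, and its constants, starting conditions, and time step are illustrative choices.

```python
import math

K = 1.0      # strength of the attraction (arbitrary units)
DT = 0.001   # time step

# positions and velocities of the two oppositely charged particles
p1, v1 = [1.0, 0.0], [0.0, 0.5]
p2, v2 = [-1.0, 0.0], [0.0, -0.5]

def step(p1, v1, p2, v2):
    dx, dy = p2[0] - p1[0], p2[1] - p1[1]
    r = math.hypot(dx, dy)
    fx, fy = K * dx / r**3, K * dy / r**3   # inverse-square attraction on particle 1
    v1 = [v1[0] + fx * DT, v1[1] + fy * DT]
    v2 = [v2[0] - fx * DT, v2[1] - fy * DT]
    p1 = [p1[0] + v1[0] * DT, p1[1] + v1[1] * DT]
    p2 = [p2[0] + v2[0] * DT, p2[1] + v2[1] * DT]
    return p1, v1, p2, v2

# Feed the new variables back into the algorithm, thousands of times.
for _ in range(10000):
    p1, v1, p2, v2 = step(p1, v1, p2, v2)
print(p1, p2)
```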

Fredkin's discovery of cellular automata a few years later permitted him further to indulge his taste for economy of information and strengthened his bond with the recursive algorithm. The patterns of automata are often all but impossible to describe with calculus yet easy to express algorithmically. Nothing is so striking about a good cellular automaton as the contrast between the simplicity of the underlying algorithm and the richness of its result. We have all felt the attraction of such contrasts. It accompanies the comprehension of any process, conceptual or physical, by which simplicity accommodates complexity. Simple solutions to complex problems, for example, make us feel good. The social engineer who designs uncomplicated legislation that will cure numerous social ills, the architect who eliminates several nagging design flaws by moving a single closet, the doctor who traces gastro-intestinal, cardiovascular, and respiratory ailments to a single, correctable cause—all feel the same kind of visceral, aesthetic satisfaction that must have filled the first caveman who literally killed two birds with one stone.

For scientists, the moment of discovery does not simply reinforce the search for knowledge; it inspires further research. Indeed, it directs research. The unifying principle, upon its apprehension, can elicit such devotion that thereafter the scientist looks everywhere for manifestations of it. It was the scientist in Fredkin who, upon seeing how a simple programming rule could yield immense complexity, got excited about looking at physics in a new way and stayed excited. He spent much of the next three decades fleshing out his intuition.

FREDKIN'S RESIGNATION FROM BOLT BERANEK AND Newman did not surprise Licklider. "I could tell that Ed was disappointed in the scope of projects undertaken at BBN. He would see them on a grander scale. I would try to argue—hey, let's cut our teeth on this and then move on to bigger things." Fredkin wasn't biting. "He came in one day and said, 'Gosh, Lick, I really love working here, but I'm going to have to leave. I've been thinking about my plans for the future, and I want to make'—I don't remember how many millions of dollars, but it shook me—'and I want to do it in about four years.' And he did amass however many millions he said he would amass in the time he predicted, which impressed me considerably."

In 1962 Fredkin founded Information International Incorporated—an impressive name for a company with no assets and no clients, whose sole employee had never graduated from college. Triple-I, as the company came to be called, was placed on the road to riches by an odd job that Fredkin performed for the Woods Hole Oceanographic Institute. One of Woods Hole's experiments had run into a complication: underwater instruments had faithfully recorded the changing direction and strength of deep ocean currents, but the information, encoded in tiny dots of light on sixteen-millimeter film, was inaccessible to the computers that were supposed to analyze it. Fredkin rented a sixteen-millimeter movie projector and with a surprisingly simple modification turned it into a machine for translating those dots into terms the computer could accept.

This contraption pleased the people at Woods Hole and led to a contract with Lincoln Laboratory. Lincoln was still doing work for the Air Force, and the Air Force wanted its computers to analyze radar information that, like the Woods Hole data, consisted of patterns of light on film. A makeshift information-conversion machine earned Triple-I $10,000, and within a year the Air Force hired Fredkin to build equipment devoted to the task. The job paid $350,000—the equivalent today of around $1 million. RCA and other companies, it turned out, also needed to turn visual patterns into digital data, and "programmable film readers" that sold for $500,000 apiece became Triple-I's stock-in-trade. In 1968 Triple-I went public and Fredkin was suddenly a millionaire. Gradually he cashed in his chips. First he bought a ranch in Colorado. Then one day he was thumbing through the classifieds and saw that an island in the Caribbean was for sale. He bought it.

In the early 1960s, at the suggestion of the Defense Department's Advanced Research Projects Agency, MIT set up what would become its Laboratory for Computer Science. It was then called Project MAC, an acronym that stood for both "machine-aided cognition" and "multiaccess computer." Fredkin had connections with the project from the beginning. Licklider, who had left BBN for the Pentagon shortly after Fredkin's departure, was influential in earmarking federal money for MAC. Marvin Minsky—who would later serve on Triple-I's board, and by the end of 1967 owned some of its stock—was centrally involved in MAC's inception. Fredkin served on Project MAC's steering committee, and in 1966 he began discussing with Minsky the possibility of becoming a visiting professor at MIT. The idea of bringing a college dropout onto the faculty, Minsky recalls, was not as outlandish as it now sounds; computer science had become an academic discipline so suddenly that many of its leading lights possessed meager formal credentials. In 1968, after Licklider had come to MIT and become the director of Project MAC, he and Minsky convinced Louis Smullin, the head of the electrical-engineering department, that Fredkin was worth the gamble. "We were a growing department and we wanted exciting people," Smullin says. "And Ed was exciting."

Fredkin had taught for barely a year before he became a full professor, and not much later, in 1971, he was appointed the head of Project MAC—a position that was also short-lived, for in the fall of 1974 he began a sabbatical at the California Institute of Technology as a Fairchild Distinguished Scholar. He went to Caltech under the sponsorship of Richard Feynman. The deal, Fredkin recalls, was that he would teach Feynman more about computer science, and Feynman would teach him more about physics. While there, Fredkin developed an idea that has slowly come to be seen as a profound contribution to both disciplines. The idea is also—in Fredkin's mind, at least—corroborating evidence for his theory of digital physics. To put its upshot in brief and therefore obscure terms, Fredkin found that computation is not inherently irreversible and thus it is possible, in principle, to build a computer that doesn't use up energy and doesn't give off heat.

All computers on the market are irreversible. That is, their history of information processing cannot be inferred from their present informational state; you cannot look at the data they contain and figure out how they arrived at it. By the time the average computer tells you that 2 plus 2 equals 4, it has forgotten the question; for all it knows, you asked what 1 plus 3 is. The reason for this ignorance is that computers discharge information once it is no longer needed, so that they won't get clogged up.

In 1961 Rolf Landauer, of IBM's Thomas J. Watson Research Center, established that this destruction of information is the only part of the computational process that unavoidably involves the dissipation of energy. It takes effort, in other words, for a computer to forget things but not necessarily for it to perform other functions. Thus the question of whether you can, in principle, build a universal computer that doesn't dissipate energy in the form of heat is synonymous with the question of whether you can design a logically reversible universal computer, one whose computational history can always be unearthed. Landauer, along with just about everyone else, thought such a computer impossible; all past computer architectures had implied the regular discarding of information, and it was widely believed that this irreversibility was intrinsic to computation. But while at Caltech, Fredkin did one of his favorite things—he showed that everyone had been wrong all along.
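
Landauer's result also has a quantitative form, which the article doesn't quote but which gives a sense of scale: erasing one bit must dissipate at least kT ln 2 of energy, where k is Boltzmann's constant and T the temperature. A quick check of the size of that bound, assuming room temperature for illustration:

```python
import math

k_B = 1.380649e-23   # Boltzmann's constant, joules per kelvin
T = 300.0            # roughly room temperature, in kelvin

E_min = k_B * T * math.log(2)   # Landauer's bound per erased bit
print(E_min)                    # about 2.9e-21 joules
```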

Of the two kinds of reversible computers invented by Fredkin, the better known is called the billiard-ball computer. If it were ever actually built, it would consist of billiard balls ricocheting around in a labyrinth of "mirrors," bouncing off the mirrors at 45-degree angles, periodically banging into other moving balls at 90-degree angles, and occasionally exiting through doorways that occasionally would permit new balls to enter. To extract data from the machine, you would superimpose a grid over it, and the presence or absence of a ball in a given square at a given point in time would constitute information. Such a machine, Fredkin showed, would qualify as a universal computer; it could do anything that normal computers do. But unlike other computers, it would be perfectly reversible; to recover its history, all you would have to do is stop it and run it backward. Charles H. Bennett, of IBM's Thomas J. Watson Research Center, independently arrived at a different proof that reversible computation is possible, though he considers the billiard-ball computer to be in some respects a more elegant solution to the problem than his own.
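
The case against built-in forgetting is easiest to see at the level of a single logic gate. The controlled-swap gate now commonly called the Fredkin gate—a standard construction attributed to him, though the article doesn't describe it, and the sketch below isn't meant to reconstruct his second machine—takes three bits in and puts three bits out, with the first bit deciding whether the other two trade places. Because nothing is discarded, the gate can always be run backward; a small Python check:

```python
def fredkin(c, a, b):
    """Controlled swap: if the control bit c is 1, swap a and b; otherwise pass them through."""
    if c == 1:
        a, b = b, a
    return c, a, b

# The gate is its own inverse: applying it twice restores any input.
for c in (0, 1):
    for a in (0, 1):
        for b in (0, 1):
            assert fredkin(*fredkin(c, a, b)) == (c, a, b)

# With a constant 0 on the third line it computes AND without erasing anything:
# fredkin(x, y, 0) returns (x, (not x) and y, x and y).
print(fredkin(1, 1, 0))   # (1, 0, 1) -- the third bit is 1 AND 1
```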

The billiard-ball computer will never be built, because it is a platonic device, existing only in a world of ideals. The balls are perfectly round and hard, and the table perfectly smooth and hard. There is no friction between the two, and no energy is lost when balls collide. Still, although these ideals are unreachable, they could be approached eternally through technological refinement, and the heat produced by friction and collision could thus be reduced without limit. Since no additional heat would be created by information loss, there would be no necessary minimum on the total heat emitted by the computer. "The cleverer you are, the less heat it will generate," Fredkin says.

The connection Fredkin sees between the billiard-ball computer and digital physics exemplifies the odd assortment of evidence he has gathered in support of his theory. Molecules and atoms and their constituents, he notes, move around in theoretically reversible fashion, like billiard balls (although it is not humanly possible, of course, actually to take stock of the physical state of the universe, or even one small corner of it, and reconstruct history by tracing the motion of microscopic particles backward). Well, he asks, given the theoretical reversibility of physical reality, doesn't the theoretical feasibility of a reversible computer lend credence to the claim that computation is reality's basis?

No and yes. Strictly speaking, Fredkin's theory doesn't demand reversible computation. It is conceivable that an irreversible process at the very core of reality could give rise to the reversible behavior of molecules, atoms, electrons, and the rest. After all, irreversible computers (that is, all computers on the market) can simulate reversible billiard balls. But they do so in a convoluted way, Fredkin says, and the connection between an irreversible substratum and a reversible stratum would, similarly, be tortuous—or, as he puts it, "aesthetically obnoxious." Fredkin prefers to think that the cellular automaton underlying reversible reality does its work gracefully.

Consider, for example, a variant of the billiard-ball computer invented by Norman Margolus, the Canadian in MIT's information-mechanics group. Margolus showed how a two-state cellular automaton that was itself reversible could simulate the billiard-ball computer using only a simple rule involving a small neighborhood. This cellular automaton in action looks like a jazzed-up version of the original video game, Pong. It is an overhead view of endlessly energetic balls ricocheting off clusters of mirrors and each other. It is proof that a very simple binary cellular automaton can give rise to the seemingly more complex behavior of microscopic particles bouncing off each other. And, as a kind of bonus, these particular particles themselves amount to a computer. Though Margolus discovered this powerful cellular-automaton rule, it was Fredkin who had first concluded that it must exist and persuaded Margolus to look for it. "He has an intuitive idea of how things should be," Margolus says. "And often, if he can't come up with a rational argument to convince you that it should be so, he'll sort of transfer his intuition to you."
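
The trick behind such rules is easiest to see in the "partitioning" scheme Margolus used: the grid is carved into 2×2 blocks, the rule rewrites each block as a whole, and the partition shifts by one cell on alternate steps so that information can flow between blocks. If the rewrite is a one-to-one mapping of block states, every step can be undone exactly. The sketch below uses a deliberately trivial bijective block rule (a quarter-turn rotation) rather than Margolus's billiard-ball rule, which is more involved; it is meant only to show why such automata are reversible.

```python
import random

SIZE = 8  # must be even so the 2x2 blocks tile the wrap-around grid

def rotate_block(a, b, c, d):
    # block layout:  a b    after a quarter turn:  c a
    #                c d                           d b
    return c, a, d, b

def unrotate_block(a, b, c, d):
    # inverse of rotate_block, used to run the automaton backward
    return b, d, a, c

def step(grid, offset, block_rule):
    new = [row[:] for row in grid]
    for by in range(0, SIZE, 2):
        for bx in range(0, SIZE, 2):
            ys = [(by + offset) % SIZE, (by + offset + 1) % SIZE]
            xs = [(bx + offset) % SIZE, (bx + offset + 1) % SIZE]
            a, b = grid[ys[0]][xs[0]], grid[ys[0]][xs[1]]
            c, d = grid[ys[1]][xs[0]], grid[ys[1]][xs[1]]
            a, b, c, d = block_rule(a, b, c, d)
            new[ys[0]][xs[0]], new[ys[0]][xs[1]] = a, b
            new[ys[1]][xs[0]], new[ys[1]][xs[1]] = c, d
    return new

start = [[random.randint(0, 1) for _ in range(SIZE)] for _ in range(SIZE)]

# Run forward ten steps, then undo them in reverse order: the history comes back.
grid = start
for t in range(10):
    grid = step(grid, t % 2, rotate_block)
for t in reversed(range(10)):
    grid = step(grid, t % 2, unrotate_block)
assert grid == start
```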

That, really, is what Fredkin is trying to do when he argues that the universe is a computer. He cannot give you a single line of reasoning that leads inexorably, or even very plausibly, to this conclusion. He can tell you about the reversible computer, about Margolus's cellular automaton, about the many physical quantities, like light, that were once thought to be continuous but are now considered discrete, and so on. The evidence consists of many little things—so many, and so little, that in the end he is forced to convey his truth by simile. "I find the supporting evidence for my beliefs in ten thousand different places," he says. "And to me it's just totally overwhelming. It's like there's an animal I want to find. I've found his footprints. I've found his droppings. I've found the half-chewed food. I find pieces of his fur, and so on. In every case it fits one kind of animal, and it's not like any animal anyone's ever seen. People say, Where is this animal? I say, Well, he was here, he's about this big, this that and the other. And I know a thousand things about him. I don't have him in hand, but I know he's there." The story changes upon retelling. One day it's Bigfoot that Fredkin's trailing. Another day it's a duck: feathers are everywhere, and the tracks are webbed. Whatever the animal, the moral of the story remains the same: "What I see is so compelling that it can't be a creature of my imagination."

V. Deus ex Machina


THERE WAS SOMETHING BOTHERSOME ABOUT ISAAC Newton's theory of gravitation. The idea that the sun exerts a pull on the earth, and vice versa, sounded vaguely supernatural and, in any event, was hard to explain. How, after all, could such "action at a distance" be realized? Did the earth look at the sun, estimate the distance, and consult the law of gravitation to determine where it should move and how fast? Newton sidestepped such questions. He fudged with the Latin phrase si esset: two bodies, he wrote, behave as if impelled by a force inversely proportional to the square of their distance. Ever since Newton, physics has followed his example. Its "force fields" are, strictly speaking, metaphorical, and its laws purely descriptive. Physicists make no attempt to explain why things obey the law of electromagnetism or of gravitation. The law is the law, and that's all there is to it.

Fredkin refuses to accept authority so blindly. He posits not only laws but also a law-enforcement agency: a computer. Somewhere out there, he believes, is a machinelike thing that actually keeps our individual bits of space abiding by the rule of the universal cellular automaton. With this belief Fredkin crosses the line between physics and metaphysics, between scientific hypothesis and cosmic speculation. If Fredkin had Newton's knack for public relations, if he stopped at saying that the universe operates as if it were a computer, he could improve his stature among physicists while preserving the essence of his theory—the idea that the dynamics of physical reality will ultimately be better captured by a single recursive algorithm than by the mathematics of conventional physics, and that the continuity of time and space implicit in traditional mathematics is illusory.

Actually, some estimable physicists have lately been saying things not wholly unlike this stripped-down version of the theory. T. D. Lee, a Nobel laureate at Columbia University, has written at length about the possibility that time is discrete. And in 1984 Scientific American, not exactly a soapbox for cranks, published an article in which Stephen Wolfram, then of Princeton's Institute for Advanced Study, wrote, "Scientific laws are now being viewed as algorithms. . . . Physical systems are viewed as computational systems, processing information much the way computers do." He concluded, "A new paradigm has been born."

The line between responsible scientific speculation and off-the-wall metaphysical pronouncement was nicely illustrated by an article in which Tommaso Toffoli, the Italian in MIT's information-mechanics group, stayed barely on the responsible side of it. Published in the journal Physica D, the article was called "Cellular automata as an alternative to (rather than an approximation of) differential equations in modeling physics." Toffoli's thesis captured the core of Fredkin's theory yet had a perfectly reasonable ring to it. He simply suggested that the historical reliance of physicists on calculus may have been due not just to its merits but also to the fact that before the computer, alternative languages of description were not practical.

Why does Fredkin refuse to do the expedient thing—leave out the part about the universe actually being a computer? One reason is that he considers reprehensible the failure of Newton, and of all physicists since, to back up their descriptions of nature with explanations. He is amazed to find "perfectly rational scientists" believing in "a form of mysticism: that things just happen because they happen." The best physics, Fredkin seems to believe, is metaphysics.

The trouble with metaphysics is its endless depth. For every question that is answered, at least one other is raised, and it is not always clear that, on balance, any progress has been made. For example, where is this computer that Fredkin keeps talking about? Is it in this universe, residing along some fifth or sixth dimension that renders it invisible? Is it in some meta-universe? The answer is the latter, apparently, and to understand why, we need to return to the problem of the infinite regress, a problem that Rolf Landauer, among others, has cited with respect to Fredkin's theory. Landauer illustrates the problem by telling the old turtle story. A professor has just finished lecturing at some august university about the origin and structure of the universe, and an old woman in tennis shoes walks up to the lectern. "Excuse me, sir, but you've got it all wrong," she says. "The truth is that the universe is sitting on the back of a huge turtle." The professor decides to humor her. "Oh, really?" he asks. "Well, tell me, what is the turtle standing on?" The lady has a ready reply: "Oh, it's standing on another turtle." The professor asks, "And what is that turtle standing on?" Without hesitation, she says, "Another turtle." The professor, still game, repeats his question. A look of impatience comes across the woman's face. She holds up her hand, stopping him in mid-sentence. "Save your breath, sonny," she says. "It's turtles all the way down."

The infinite-regress problem afflicts Fredkin's theory in two ways, one of which we have already encountered: if matter is made of information, what is the information made of? And even if one concedes that it is no more ludicrous for information to be the most fundamental stuff than for matter or energy to be the most fundamental stuff, what about the computer itself? What is it made of? What energizes it? Who, or what, runs it, or set it in motion to begin with?

WHEN FREDKIN IS DISCUSSING THE PROBLEM OF THE infinite regress, his logic seems variously cryptic, evasive, and appealing. At one point he says, "For everything in the world where you wonder, 'What is it made out of?' the only thing I know of where the question doesn't have to be answered with anything else is for information." This puzzles me. Thousands of words later I am still puzzled, and I press for clarification. He talks some more. What he means, as near as I can tell, is what follows.

First of all, it doesn't matter what the information is made of, or what kind of computer produces it. The computer could be of the conventional electronic sort, or it could be a hydraulic machine made of gargantuan sewage pipes and manhole covers, or it could be something we can't even imagine. What's the difference? Who cares what the information consists of? So long as the cellular automaton's rule is the same in each case, the patterns of information will be the same, and so will we, because the structure of our world depends on pattern, not on the pattern's substrate; a carbon atom, according to Fredkin, is a certain configuration of bits, not a certain kind of bits.

Besides, we can never know what the information is made of or what kind of machine is processing it. This point is reminiscent of childhood conversations that Fredkin remembers having with his sister, Joan, about the possibility that they were part of a dream God was having. "Say God is in a room and on his table he has some cookies and tea," Fredkin says. "And he's dreaming this whole universe up. Well, we can't reach out and get his cookies. They're not in our universe. See, our universe has bounds. There are some things in it and some things not." The computer is not; hardware is beyond the grasp of its software. Imagine a vast computer program that contained bodies of information as complex as people, motivated by bodies of information as complex as ideas. These "people" would have no way of figuring out what kind of computer they owed their existence to, because everything they said, and everything they did—including formulating metaphysical hypotheses—would depend entirely on the programming rules and the original input. As long as these didn't change, the same metaphysical conclusions would be reached in an old XD-1 as in a Kaypro 2.

This idea—that sentient beings could be constitutionally numb to the texture of reality—has fascinated a number of people, including, lately, computer scientists. One source of the fascination is the fact that any universal computer can simulate another universal computer, and the simulated computer can, because it is universal, do the same thing. So it is possible to conceive of a theoretically endless series of computers contained, like Russian dolls, in larger versions of themselves and yet oblivious of those containers. To anyone who has lived intimately with, and thought deeply about, computers, says Charles Bennett, of IBM's Watson Lab, this notion is very attractive. "And if you're too attracted to it, you're likely to part company with the physicists." Physicists, Bennett says, find heretical the notion that anything physical is impervious to experiment, removed from the reach of science.

Fredkin's belief in the limits of scientific knowledge may sound like evidence of humility, but in the end it permits great ambition; it helps him go after some of the grandest philosophical questions around. For example, there is a paradox that crops up whenever people think about how the universe came to be. On the one hand, it must have had a beginning. After all, things usually do. Besides, the cosmological evidence suggests a beginning: the big bang. Yet science insists that it is impossible for something to come from nothing; the laws of physics forbid the amount of energy and mass in the universe to change. So how could there have been a time when there was no universe, and thus no mass or energy?

Fredkin escapes from this paradox without breaking a sweat. Granted, he says, the laws of our universe don't permit something to come from nothing. But he can imagine laws that would permit such a thing; in fact, he can imagine algorithmic laws that would permit such a thing. The conservation of mass and energy is a consequence of our cellular automaton's rules, not a consequence of all possible rules. Perhaps a different cellular automaton governed the creation of our cellular automaton—just as the rules for loading software are different from the rules running the program once it has been loaded.

What's funny is how hard it is to doubt Fredkin when with such assurance he makes definitive statements about the creation of the universe—or when, for that matter, he looks you in the eye and tells you the universe is a computer. Partly this is because, given the magnitude and intrinsic intractability of the questions he is addressing, his answers aren't all that bad. As ideas about the foundations of physics go, his are not completely out of the ball park; as metaphysical and cosmogonic speculation goes, his isn't beyond the pale.

But there's more to it than that. Fredkin is, in his own odd way, a rhetorician of great skill. He talks softly, even coolly, but with a low-key power, a quiet and relentless confidence, a kind of high-tech fervor. And there is something disarming about his self-awareness. He's not one of these people who say crazy things without having so much as a clue that you're sitting there thinking what crazy things they are. He is acutely conscious of his reputation; he knows that some scientists are reluctant to invite him to conferences for fear that he'll say embarrassing things. But he is not fazed by their doubts. "You know, I'm a reasonably smart person. I'm not the smartest person in the world, but I'm pretty smart—and I know that what I'm involved in makes perfect sense. A lot of people build up what might be called self-delusional systems, where they have this whole system that makes perfect sense to them, but no one else ever understands it or buys it. I don't think that's a major factor here, though others might disagree." It's hard to disagree, when he so forthrightly offers you the chance.

Still, as he gets further from physics, and more deeply into philosophy, he begins to try one's trust. For example, having tackled the question of what sort of process could generate a universe in which spontaneous generation is impossible, he aims immediately for bigger game: Why was the universe created? Why is there something here instead of nothing?

WHEN THIS SUBJECT COMES UP, WE ARE SITTING IN the Fredkins' villa. The living area has pale rock walls, shiny-clean floors made of large white ceramic tiles, and built-in bookcases made of blond wood. There is lots of air—the ceiling slopes up in the middle to at least twenty feet—and the air keeps moving; some walls consist almost entirely of wooden shutters that, when open, let the sea breeze pass as fast as it will. I am glad of this. My skin, after three days on Fredkin's island, is hot, and the air, though heavy, is cool. The sun is going down.

Fredkin, sitting on a white sofa, is talking about an interesting characteristic of some computer programs, including many cellular automata: there is no shortcut to finding out what they will lead to. This, indeed, is a basic difference between the "analytical" approach associated with traditional mathematics, including differential equations, and the "computational" approach associated with algorithms. You can predict a future state of a system susceptible to the analytic approach without figuring out what states it will occupy between now and then, but in the case of many cellular automata, you must go through all the intermediate states to find out what the end will be like: there is no way to know the future except to watch it unfold.
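
The contrast is easy to make concrete with a toy pair of examples—both invented here for illustration, neither drawn from the article. An analytic formula lets you jump straight to any future moment; a step-by-step rule with no known closed form has to be marched through every intermediate state.

```python
# Analytic: a body under constant acceleration has a closed-form position,
# so its state at t = 1000 can be evaluated directly, skipping the interim.
def position(t, x0=0.0, v0=3.0, a=-9.8):
    return x0 + v0 * t + 0.5 * a * t * t

print(position(1000.0))   # no need to visit t = 1, 2, ..., 999

# Computational: for this arbitrary scrambling rule no closed form is known,
# so the only general way to learn the state after 1000 steps is to take them all.
def update(state):
    return (state * state + 1) % 4_294_967_291   # keep the numbers bounded

state = 7
for _ in range(1000):
    state = update(state)
print(state)
```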

This indeterminacy is very suggestive. It suggests, first of all, why so many "chaotic" phenomena, like smoke rising from a cigarette, are so difficult to predict using conventional mathematics. (In fact, some scientists have taken to modeling chaotic systems with cellular automata.) To Fredkin, it also suggests that even if human behavior is entirely determined, entirely inevitable, it may be unpredictable; there is room for "pseudo free will" in a completely mechanistic universe. But on this particular evening Fredkin is interested mainly in cosmogony, in the implications of this indeterminacy for the big question: Why does this giant computer of a universe exist?

It's simple, Fredkin explains: "The reason is, there is no way to know the answer to some question any faster than what's going on."

Aware that he may have said something enigmatic, Fredkin elaborates. Suppose, he says, that there is an all-powerful God. "And he's thinking of creating this universe. He's going to spend seven days on the job—this is totally allegorical—or six days on the job. Okay, now, if he's as all-powerful as you might imagine, he can say to himself, 'Wait a minute, why waste the time? I can create the whole thing, or I can just think about it for a minute and just realize what's going to happen so that I don't have to bother.' Now, ordinary physics says, Well, yeah, you got an all-powerful God, he can probably do that. What I can say is—this is very interesting—I can say I don't care how powerful God is; he cannot know the answer to the question any faster than doing it. Now, he can have various ways of doing it, but he has to do every Goddamn single step with every bit or he won't get the right answer. There's no shortcut."

Around sundown on Fredkin's island all kinds of insects start chirping or buzzing or whirring. Meanwhile, the wind chimes hanging just outside the back door are tinkling with methodical randomness. All this music is eerie and vaguely mystical. And so, increasingly, is the conversation. It is one of those moments when the context you've constructed falls apart, and gives way to a new, considerably stranger one. The old context in this case was that Fredkin is an iconoclastic thinker who believes that space and time are discrete, that the laws of the universe are algorithmic, and that the universe works according to the same principles as a computer (he uses this very phrasing in his most circumspect moments). The new context is that Fredkin believes that the universe is very literally a computer and that it is being used by someone, or something, to solve a problem. It sounds like a good-news/bad-news joke: the good news is that our lives have purpose; the bad news is that their purpose is to help some remote hacker estimate pi to nine jillion decimal places.

So, I say, you're arguing that the reason we're here is that some being wanted to theorize about reality, and the only way he could test his theories was to create reality? "No, you see, my explanation is much more abstract. I don't imagine there is a being or anything. I'm just using that to talk to you about it. What I'm saying is that there is no way to know what the future is any faster than running this [the universe] to get to that [the future]. Therefore, what I'm assuming is that there is a question and there is an answer, okay? I don't make any assumptions about who has the question, who wants the answer, anything."

But the more we talk, the closer Fredkin comes to the religious undercurrents he's trying to avoid. "Every astrophysical phenomenon that's going on is always assumed to be just accident," he says. "To me, this is a fairly arrogant position, in that intelligence—and computation, which includes intelligence, in my view—is a much more universal thing than people think. It's hard for me to believe that everything out there is just an accident." This sounds awfully like a position that Pope John Paul II or Billy Graham would take, and Fredkin is at pains to clarify his position: "I guess what I'm saying is—I don't have any religious belief. I don't believe that there is a God. I don't believe in Christianity or Judaism or anything like that, okay? I'm not an atheist, I'm not an agnostic, I'm just in a simple state. I don't know what there is or might be. But what I can say is that it seems likely to me that this particular universe we have is a consequence of something I would call intelligent." Does he mean that there's something out there that wanted to get the answer to a question? "Yeah." Something that set up the universe to see what would happen? "In some way, yes."

VI. The Language Barrier


IN 1974, UPON RETURNING TO MIT FROM CALTECH, Fredkin was primed to revolutionize science. Having done the broad conceptual work (concluding that the universe is a computer), he would enlist the aid of others in taking care of the details—translating the differential equations of physics into algorithms, experimenting with cellular-automaton rules and selecting the most elegant, and, eventually, discovering The Rule, the single law that governs every bit of space and accounts for everything. "He figured that all he needed was some people who knew physics, and that it would all be easy," Margolus says.

One early obstacle was Fredkin's reputation. He says, "I would find a brilliant student; he'd get turned on to this stuff and start to work on it. And then he would come to me and say, 'I'm going to work on something else.' And I would say, 'Why?' And I had a few very honest ones, and they would say, 'Well, I've been talking to my friends about this and they say I'm totally crazy to work on it. It'll ruin my career. I'll be tainted forever.'" Such fears were not entirely unfounded. Fredkin is one of those people who arouse either affection, admiration, and respect, or dislike and suspicion. The latter reaction has come from a number of professors at MIT, particularly those who put a premium on formal credentials, proper academic conduct, and not sounding like a crackpot. Fredkin was never oblivious of the complaints that his work wasn't "worthy of MIT," nor of the movements, periodically afoot, to sever, or at least weaken, his ties to the university. Neither were his graduate students.

Fredkin's critics finally got their way. In the early 1980s, while he was serving briefly as the president of Boston's CBS-TV affiliate, someone noticed that he wasn't spending much time around MIT and pointed to a faculty rule limiting outside professional activities. Fredkin was finding MIT "less and less interesting" anyway, so he agreed to be designated an adjunct professor. As he recalls the deal, he was going to do a moderate amount of teaching and be paid an "appropriate" salary. But he found the actual salary insulting, declined payment, and never got around to teaching. Not surprisingly, he was not reappointed adjunct professor when his term expired, in 1986. Meanwhile, he had so nominally discharged his duties as the head of the information-mechanics group that the title was given to Toffoli.

Fredkin doubts that his ideas will achieve widespread acceptance anytime soon. He believes that most physicists are so deeply immersed in their kind of mathematics, and so uncomprehending of computation, as to be incapable of grasping the truth. Imagine, he says, that a twentieth-century time traveler visited Italy in the early seventeenth century and tried to reformulate Galileo's ideas in terms of calculus. Although it would be a vastly more powerful language of description than the old one, conveying its importance to the average scientist would be nearly impossible. There are times when Fredkin breaks through the language barrier, but they are few and far between. He can sell one person on one idea, another on another, but nobody seems to get the big picture. It's like a painting of a horse in a meadow, he says. "Everyone else only looks at it with a microscope, and they say, 'Aha, over here I see a little brown pigment. And over here I see a little green pigment.' Okay. Well, I see a horse."

Fredkin's research has nevertheless paid off in unanticipated ways. Comparing a computer's workings and the dynamics of physics turned out to be a good way to figure out how to build a very efficient computer—one that harnesses the laws of physics with great economy. Thus Toffoli and Margolus have designed an inexpensive but powerful cellular-automata machine, the CAM 6. The "machine" is actually a circuit board that, when inserted in a personal computer, permits it to orchestrate visual complexity at a speed that can be matched only by general-purpose computers costing hundreds of thousands of dollars. Since the circuit board costs only around $1,500, this engrossing machine may well entice young scientific revolutionaries into joining the quest for The Rule. Fredkin speaks of this possibility in almost biblical terms: "The big hope is that there will arise somewhere someone who will have some new, brilliant ideas," he says. "And I think this machine will have a dramatic effect on the probability of that happening."

But even if it does happen, it will not ensure Fredkin a place in scientific history. He is not really on record as believing that the universe is a computer. Although some of his tamer insights have been adopted, fleshed out, and published by Toffoli or Margolus, sometimes in collaboration with him, Fredkin himself has published nothing on digital physics. His stated rationale for not publishing has to do with, of all things, lack of ambition. "I'm just not terribly interested," he says. "A lot of people are fantastically motivated by publishing. It's part of a whole thing of getting ahead in the world." Margolus has another explanation: "Writing something down in good form takes a lot of time. And usually by the time he's done with the first or second draft, he has another wonderful idea that he's off on."

These two theories have merit, but so does a third: Fredkin can't write for academic journals. He doesn't know how. His erratic, hybrid education has left him with a mixture of terminology that neither computer scientists nor physicists recognize as their native tongue. Further, he is not schooled in the rules of scientific discourse; he seems just barely aware of the line between scientific hypothesis and philosophical speculation. He is not politic enough to confine his argument to its essence: that time and space are discrete, and that the state of every point in space at any point in time is determined by a single algorithm. In short, the very background that has allowed Fredkin to see the universe as a computer seems to prevent him from sharing his vision. If he could talk like other scientists, he might see only the things that they see.


Robert Wright is the author of Three Scientists and Their Gods: Looking for Meaning in an Age of Information, The Moral Animal: Evolutionary Psychology and Everyday Life, and Nonzero: The Logic of Human Destiny.
Copyright © 2002 by The Atlantic Monthly Group. All rights reserved.
The Atlantic Monthly; April 1988; Did the Universe Just Happen?; Volume 261, No. 4; page 29.
Killexams : Data Storage Market 2022 Global Size, Segments, Share and Growth Factor Analysis, Top Key Players Research Report 2028 | COVID-19 Impact on Industry


Jul 11, 2022 (The Expresswire) -- The global “Data Storage Market” report provides a complete examination of market dynamics, size, share, current developments, and trending business strategies. The report gives a complete analysis of different segments on the basis of type, application, and region, and outlines the market characteristics and growth of the Data Storage industry, categorized by type, application, and consumer sector. In addition, it provides a comprehensive analysis of aspects involved in market development before and after the Covid-19 pandemic.

Get a sample PDF of report at - https://www.researchreportsworld.com/enquiry/request-sample/21045994

Market Analysis and Insights: Global Data Storage Market

This report focuses on global Data Storage Market Share, also covers the segmentation data of other regions in regional level and county level.

Due to the COVID-19 pandemic, the global Data Storage market is estimated to be worth USD million in 2022 and is forecast to reach a readjusted size of USD million by 2028, growing at a CAGR of during the review period. Fully considering the economic change caused by this health crisis, the leading segment by Type, which accounted for a share of the global Data Storage market in 2021, is projected to be valued at USD million by 2028, growing at a revised CAGR in the post-COVID-19 period. By Application, the leading segment, which accounted for over percent of the market share in 2021, is expected to grow at a revised CAGR throughout the forecast period.

In the United States, the Data Storage market size is expected to grow from USD million in 2021 to USD million by 2028, at a CAGR of during the forecast period.
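Although the specific dollar values and growth rates are elided in the excerpt above, the growth figures the report refers to are CAGRs, which relate a starting value, an ending value, and a number of years. The short Python sketch below shows the standard calculation using hypothetical figures of our own; they are not numbers taken from the report.

def cagr(start_value, end_value, years):
    """Compound annual growth rate between two values over a span of years."""
    return (end_value / start_value) ** (1 / years) - 1

# Hypothetical example: a market growing from USD 50 billion (2021) to USD 80 billion (2028).
print(f"{cagr(50e9, 80e9, 7):.2%}")  # roughly 6.9% per year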

Get a sample Copy of the Data Storage Market Report 2022

List of TOP KEY PLAYERS in Data Storage Market Report are -

● HPE ● NetApp ● Dell EMC ● IBM ● Pure Storage ● Hitachi ● Fujitsu ● Huawei ● Western Digital

Global Data Storage Market: Segment Analysis
The research report includes specific segments by region (country), by company, by Type and by Application. This study provides information about the sales and revenue during the historic and forecasted period of 2017 to 2028. Understanding the segments helps in identifying the importance of different factors that aid the market growth.

The Data Storage Market is Segmented by Types:

● All-Flash Arrays ● Hybrid Storage Arrays ● HDD Arrays

The Data Storage Market is Segmented by Applications:

● IT and Telecom ● BFSI ● Healthcare ● Education ● Manufacturing ● Media and Entertainment ● Energy and Utility ● Retail and e-Commerce ● Others

Enquire before purchasing this report - https://www.researchreportsworld.com/enquiry/pre-order-enquiry/21045994

Geographically, this report is segmented into several key regions, with sales, revenue, market share and growth Rate of Data Storage in these regions, from 2022 to 2028, covering

● North America (United States, Canada and Mexico) ● Europe (Germany, UK, France, Italy, Russia and Turkey etc.) ● Asia-Pacific (China, Japan, Korea, India, Australia, Indonesia, Thailand, Philippines, Malaysia and Vietnam) ● South America (Brazil, Argentina, Columbia etc.) ● Middle East and Africa (Saudi Arabia, UAE, Egypt, Nigeria and South Africa)

Important Features that are under Offering and Key Highlights of the Reports:

● To get insights into the countries covered in the Data Storage market. ● To get wide-ranging information about the top key players in this industry, their product portfolios, and the key strategies they have embraced. ● To get a complete overview of the Data Storage market. ● To understand the future outlook for the market. ● To learn about the market plans being adopted by top organizations. ● To understand the most influential driving and restraining forces in the market and their impact on the global market.

Key Questions Addressed by the Report

● What new products/services are competitors exploring? ● Who are the key players in the Data Storage market, and how intense is the competition? ● What future market trends are manufacturers emphasizing in upcoming updates? ● For each segment, what are the crucial opportunities in the market? ● What are the key growth strategies embraced by key market players? ● What are the key success strategies adopted by major competitors in the market?

An exhaustive and professional study of the global Data Storage market has been completed by industry professionals and presented in a manner that highlights only the details that matter most. The report focuses on the most dynamic information about the global market.

Purchase this report (Price 2900 USD for single user license)

https://www.researchreportsworld.com/purchase/21045994

Major Points from Table of Contents:

1 Data Storage Market Overview

1.1 Data Storage Product Scope

1.2 Data Storage Segment by Type

1.3 Data Storage Segment by Application

1.4 Data Storage Market Estimates and Forecasts (2017-2028)

2 Data Storage Estimates and Forecasts by Region

2.1 Global Data Storage Market Size by Region: 2017 VS 2021 VS 2028

2.2 Global Data Storage Market Scenario by Region (2017-2021)

2.3 Global Market Estimates and Forecasts by Region (2022-2028)

2.4 Geographic Market Analysis: Market Facts and Figures

3 Global Data Storage Competition Landscape by Players

3.1 Global Top Data Storage Players by Sales (2017-2021)

3.2 Global Top Data Storage Players by Revenue (2017-2021)

3.3 Global Data Storage Market Share by Company Type (Tier 1, Tier 2 and Tier 3) and (based on the Revenue in Data Storage as of 2020)

3.4 Global Data Storage Average Price by Company (2017-2021)

3.5 Manufacturers Data Storage Manufacturing Sites, Area Served, Product Type

3.6 Manufacturers Mergers and Acquisitions, Expansion Plans

4 Global Data Storage Market Size by Type

4.1 Global Data Storage Historic Market Review by Type (2017-2021)

4.2 Global Market Estimates and Forecasts by Type (2022-2028)

4.2.3 Global Price Forecast by Type (2022-2028)

5 Global Data Storage Market Size by Application

5.1 Global Data Storage Historic Market Review by Application (2017-2021)

5.2 Global Market Estimates and Forecasts by Application (2022-2028)

6 North America Data Storage Market Facts and Figures

6.1 North America Data Storage by Company

6.2 North America Data Storage Breakdown by Type

6.3 North America Data Storage Breakdown by Application

7 Europe Data Storage Market Facts and Figures

8 China Data Storage Market Facts and Figures

9 Japan Data Storage Market Facts and Figures

10 Southeast Asia Data Storage Market Facts and Figures

11 India Data Storage Market Facts and Figures

12 Company Profiles and Key Figures in Data Storage Business

13 Data Storage Manufacturing Cost Analysis

13.1 Data Storage Key Raw Materials Analysis

13.1.1 Key Raw Materials

13.1.2 Key Raw Materials Price Trend

13.1.3 Key Suppliers of Raw Materials

13.2 Proportion of Manufacturing Cost Structure

13.3 Manufacturing Process Analysis of Data Storage

13.4 Data Storage Industrial Chain Analysis

14 Marketing Channel, Distributors and Customers

14.1 Marketing Channel

14.2 Data Storage Distributors List

14.3 Data Storage Customers

15 Market Dynamics

15.1 Data Storage Market Trends

15.2 Data Storage Drivers

15.3 Data Storage Market Challenges

15.4 Data Storage Market Restraints

Continued…

Browse complete table of contents at -

https://www.researchreportsworld.com/TOC/21045994#TOC

About Us:

Research Reports World is a credible source for the market reports that will give your business the lead it needs. Our objective is to provide a platform for top market research firms worldwide to publish their research reports, and to help decision makers find the most suitable market research solutions under one roof. Our aim is to provide the best solution that matches each customer's exact requirements, which is why we offer both custom and syndicated research reports.

Contact Us:

Research Reports World

Phone: US (+1) 424 253 0807

UK (+44) 203 239 8187

Email: sales@researchreportsworld.com

Website: https://www.researchreportsworld.com/

Other Reports Here:

Online test Proctoring Market Size, Share 2022 Industry Growth, Demand, Emerging Technologies, Sales Revenue, Key Players Analysis, Development Status, Opportunity Assessment and Industry Expansion Strategies 2028

Natural Pulse Protein Market Size, Share, Growth Factors, 2022 Global Top Leaders, Development Strategy, Future Trends and Forecast 2028

Resistive Industrial Touchscreen Market Size, Share, Growth Factors, 2022 Global Top Leaders, Development Strategy, Future Trends and Forecast 2028

Sample Cylinders Market Size, Future Growth, Share, 2022 Leading Players, Industry Updates, Business Prospects and Future Investments by Forecast to 2028

Bluetooth Conference Speaker Market Share, Size, Growth, Business Demand, Global Analysis, Trends, Research and Forecast to 2022-2028

Press Release Distributed by The Express Wire

To view the original version on The Express Wire visit Data Storage Market 2022 Global Size, Segments, Share and Growth Factor Analysis, Top Key Players Research Report 2028 | COVID-19 Impact on Industry




Killexams : Financial Information and Analysis

Quality-based project management is a field that entails managing people, resources and budgets to ensure that projects are completed on time, on budget and within performance requirements.

This minor is available to all undergraduate students and is intended for students who want to prepare themselves for potential careers in project-centered work. This is the most popular minor in the School of Business and draws students from the School of Engineering, the School of Arts & Sciences and the School of Business. Students with this minor can pursue jobs in construction management, contract administration and cost engineering, to name a few.

All courses are 3 credits unless noted.

 

Your Future

A unique benefit of this minor is that students can pursue certification through the Project Management Institute (PMI)™ after completing the requirements of the minor. PMI's Certified Associate in Project Management (CAPM)® is considered the pathway to the Project Management Professional (PMP)® certification, which is rapidly emerging as one of the fastest-growing professional certifications in many industries and career areas.

Additionally, certain students may opt to sit for the American Society for Quality's "Certified Quality Improvement Associate" exam, since the Quality Management course covers the body of knowledge for that particular certification. Students who pursue the minor are under no obligation to sit for the CAPM® or CQIA®, which require an application and a separate fee, completed and paid for by the student.

Quality-Based Project Management Minor

Clarkson University offers a Minor in Quality-Based Project Management, a field that entails managing people, resources, and budgets to ensure projects are completed on time, on budget, and within performance requirements.
The minor is:

  • Open to all students in all majors and is useful for engineers, science and business majors.  
  • An opportunity for students to pursue certification through the Project Management Institute (PMI), a great resume item and source of value recognized by employers.
  • An opportunity for certain students to sit for the American Society for Quality's "Certified Quality Improvement Associate" exam.

To earn a minor in quality-based project management, students must maintain a 2.0 average in the five 3-credit courses, distributed in the following fashion:

 

Quality-Based Project Management Minor Core I

Students must take the following courses:

  • OM/EM380 Project Management 
  • OM/EM451 Quality Management and Lean Enterprise 

Students must take one of the following courses:

  • OM/EM484 Advanced Project Management 
  • EM482 Systems Engineering and Management

Quality-Based Project Management Minor Core II

Students must complete one of the following options:

Option 1

  • OS/EM286 Organizational Behavior 
  • OS352 Strategic Human Resource Management 

Option 2

  • OS/EM286 Organizational Behavior 
  • OS466 Negotiations and Relationship Management 

Option 3

  • OM/EM331 Operations and Supply Chain Management
  • OM/EM476 Management of Technology
  • or EM482 Systems Engineering and Management
  • or EM/OM484 Advanced Project Management, whichever is not selected in Core I
Killexams : These are 10 best U.S. jobs of 2022, according to new research—many pay over $100,000

The "perfect job" doesn't exist — but the most popular, sought-after roles have a few attributes in common: competitive salaries, a positive workplace culture and clear opportunities for career advancement. According to new research from Indeed, many of the top jobs are seeing rapid growth and offering six-figure salaries. 

On Tuesday, Indeed released its latest report highlighting the 20 best jobs in America for 2022, focusing on positions with an average salary of at least $75,000, which was calculated as the mean of salaries listed in job postings for that role, and at least 25 job postings per one million total postings on the website. Indeed ranked jobs based on these two metrics and the growth rate of openings on its website for each job between 2019 and 2022. 
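Indeed's stated criteria amount to a filter-then-rank procedure: keep only roles with an average posted salary of at least $75,000 and at least 25 postings per million, then rank the survivors on salary, posting share, and 2019-2022 growth. The Python sketch below reproduces that logic on made-up data; the field names, the sample numbers, and the equal weighting of the three ranks are assumptions made for illustration, not Indeed's actual methodology.

jobs = [
    {"title": "Registered nurse", "salary": 84074, "per_million": 300, "growth": 0.34},
    {"title": "Optometrist", "salary": 118389, "per_million": 40, "growth": 1.21},
    {"title": "Example role", "salary": 70000, "per_million": 500, "growth": 0.50},  # fails the salary filter
]

# Apply the two eligibility thresholds described in the article.
eligible = [j for j in jobs if j["salary"] >= 75000 and j["per_million"] >= 25]

def rank_of(values):
    """Return the rank of each value (1 = largest)."""
    order = sorted(values, reverse=True)
    return [order.index(v) + 1 for v in values]

salary_rank = rank_of([j["salary"] for j in eligible])
share_rank = rank_of([j["per_million"] for j in eligible])
growth_rank = rank_of([j["growth"] for j in eligible])

# Combine the three ranks with equal weight (an assumption) and sort best-first.
scored = sorted(
    zip(eligible, salary_rank, share_rank, growth_rank),
    key=lambda t: t[1] + t[2] + t[3],
)
for job, *_ in scored:
    print(job["title"])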

Jobs in health care dominate the list, claiming four spots in the top 10, including the No. 1 job on the list: registered nurse. As demand for health services continues to soar amid the ongoing Covid-19 pandemic, an aging population and a rising number of people living with chronic health conditions are also fueling the need for more health-care professionals.

Here are the 10 best U.S. jobs in 2022, according to Indeed. The full list of the top 20 jobs is available in Indeed's report.

1. Registered nurse

Average salary: $84,074

Percent of growth in number of job postings, 2019-2022: 34%

Education requirements: Associate or bachelor's degree in nursing, pass the National Council Licensure Examination for Registered Nurses

2. Optometrist 

Average salary: $118,389

Percent of growth in number of job postings, 2019-2022: 121%

Education requirements: Bachelor's degree, pass the Optometry Admission Test, Doctor of Optometry degree, pass National Board of Examiners in Optometry exam

3. Site reliability engineer 

Average salary: $137,324

Percent of growth in number of job postings, 2019-2022: 175%

Education requirements: Bachelor's degree, optional certifications

4. Real estate agent 

Average salary: $82,015

Percent of growth in number of job postings, 2019-2022: 23%

Education requirements: High school diploma, pass real estate exam

5. Pharmacist 

Average salary: $101,589

Percent of growth in number of job postings, 2019-2022: 83%

Education requirements: Bachelor's degree, Doctor of Pharmacy degree, pass North American Pharmacist Licensure Exam

6. Over-the-road truck driver 

Average salary: $102,678

Percent of growth in number of job postings, 2019-2022: 242%

Education requirements: High school diploma or equivalent (not always required), commercial learner's permit (CLP), commercial driver's license (CDL)

7. Software engineer 

Average salary: $126,127

Percent of growth in number of job postings, 2019-2022: 87%

Education requirements: Associate degree or bachelor's degree, optional certifications

8. Nurse practitioner 

Average salary: $128,105

Percent of growth in number of job postings, 2019-2022: 100% 

Education requirements: Associate or bachelor's degree in nursing, pass the NCLEX-RN exam, Master of Science in Nursing (MSN) or Doctor of Nursing Practice (DNP), pass a national NP board certification exam

9. Product designer 

Average salary: $113,722

Percent of growth in number of job postings, 2019-2022: 128%

Education requirements: None; Bachelor's degree recommended 

10. Solar consultant 

Killexams : Aslam Moosa

Aslam has been an entrepreneur in the skill development and training space. Over the last 12 years, he has built Speakwell, one of India's leading spoken-English training brands, with over 100 centres spread across India.

Replicating his success in the English-language space, Aslam is currently mentoring other intrapreneurs to build brand-extension verticals aligned with spoken English, such as IT training and government exam preparation, under new brand identities. Stek is the recently launched IT training brand, and Promising Career Academy is the competitive-exams training brand.

Before turning to entrepreneurship, Aslam was an IT professional working with companies such as IBM and Hexaware, where he worked in business development. His passion for making an impact through skill enhancement led him to start Speakwell, and today he is on a mission to maximize that impact. His vision is to enhance the employability of 10 million Indian youth through his ventures.

http://www.speakwell.co.in

Killexams : Blockchain-Based Startup Altswitch Awarded Gold Place By Industry Experts At Future Innovation Summit Dubai

(MENAFN- GlobeNewsWire - Nasdaq)

NEW YORK, NEW YORK, July 12, 2022 (GLOBE NEWSWIRE) --

Future Innovation Summit V2, a recently concluded event hosted by the Private Office of H.H. Sheikh Saqer Bin Mohammed Al Qasimi under the leadership of H.E. Adnan Al Noorani, was held on May 11-12 at the Meydan Hotel in Dubai. In the audience were government officials, prominent personalities, and organizations who gathered for critical discussions around topics such as space, the metaverse, blockchain technology, and sustainability. Financial and blockchain experts in attendance included Dr. Michael Gebert (Chairman and founding member of the European Blockchain Association), Christian Noll (General Manager, IBM Consulting, IBM Middle East and Africa), and Rizwan Sajan (Founder and Chairman of Danube Group), among many others. The event was also attended by H.H. Sheikh Saqer Bin Mohammed Al Qasimi himself.
Many competitors were vying for a spot in the top three with projects providing innovative solutions to niche industries while catering to wide audiences. AltSwitch's hardware wallet debut secured a unanimous vote to win the gold place with its sleek and luxurious design, multi-layer security encryption technology, and NFT-integrated display. The company combined these elements into a formula for success in the eyes of judges keen on assessing the viability and competitive edge of all the contestants.
A vital point of the presentation, which the CEO also spoke about on stage, was the importance of self-custody and the social responsibility blockchain-based companies bear for digital asset security. He emphasized that the alarming growth in digital asset theft cannot be left unchecked, as it damages cryptocurrency's reputation and each victim's financial future. He argued that businesses should maintain a human-to-human element in how they operate, treating customers not as mere statistics but as people with lives and hard-earned livelihoods at stake. Another key point was the company's mission to empower people's self-custodial capability by placing the security of their assets in their own hands rather than with conventional financial institutions.
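In concrete terms, self-custody means the secret that controls the funds is generated and kept on a device the owner controls rather than entrusted to a third party. The short Python fragment below is only a conceptual illustration using the standard library; it is not AltSwitch's implementation, and it omits everything a production wallet requires, such as BIP-39 seed phrases, secure hardware elements, and transaction signing.

import hashlib
import secrets

# The secret is created locally and never leaves the owner's device.
private_key = secrets.token_bytes(32)  # a 256-bit secret generated on-device

# Only a non-reversible fingerprint of the key would ever be shared.
fingerprint = hashlib.sha256(private_key).hexdigest()[:16]
print("Key generated on-device; only a fingerprint is shared:", fingerprint)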
Securing one of the coveted winning spots in the competition opens the company's future to even greater heights, as it solidifies its reputation as an up-and-coming figure in the digital asset security industry. Some of the prizes from the Sheikh's private office include (but are not limited to) business development assistance, company registration and an office location in Dubai, as well as an endorsement to strategic partners and government entities. These provide a key opportunity for the company to establish a base of operations in the UAE, which has positioned itself as a haven for emerging technologies and innovations, with strategic access to the GCC. Only time will tell what level of success AltSwitch will achieve, but this win is a strong step in the right direction for the company in a globally competitive industry.
AltSwitch is a blockchain-based company building products with an ecosystem of decentralized apps and services that aim to make self-custody and decentralized finance more secure and accessible. Its core team members in attendance were CEO Carl Munsayac, CMO Dragos Petrovan, CTO Vlad Sulea, CCO Carlo James Nuque, CDO Clark Abella, CSMO Maria Kaye Labay, CHRO Shari Ashley, and CPRO Nicholas Sledziona who were able to catch the attention of both industry experts as well as established venture capital and investment firm delegates with their presentation.

AltSwitch is making a statement worthy of attention as the next big player in the hardware wallet industry. This is a perfect opportunity to give this startup a closer look and potentially become an early investor in a breakout company that will introduce a revolutionary solution to a multi-billion dollar problem.



