View Feedback

View previously submitted feedback (before October 27th, 2016)


October 30, 2017

The development team is generally on the right track in defining the four component skills and evaluation criteria for Critical Thinking. I teach history classes at CNM Community College, usually dealing with students in the "emerging" and the "developing" stages, and assignments we are likely to use in our assessments may range from short term papers to book reviews or short essay responses to questions asked by the instructor on an exam or a take-home writing assignment. Given the latter, here are some thoughts and suggestions for modifying some of the wording related to this GE Learning Outcome.

1. In many cases, especially at the "emerging" stage, it is the instructor who sets or "delineates" the problem or the question for the student. For instance, my book review assignment may require that the students "describe and analyze" an assigned reading. Or, an essay question may ask them to explain the major causes of the American Revolution. In both cases, I have set or "delineated" the question, not the students, though they do have to apply critical thinking to complete the task at hand. In short, I believe that defining the first of the four component skills of critical thinking as "Problem Setting" is too vague and limited. I'd rather see a broader definition, such as "skillfully approaching or conceptualizing a research task, problem, or question." I like other definitions of critical thinking that I've seen, as well, that describe it as a "mode of thinking about a subject or content"--not just a problem or question.

2. I noticed that for "Evidence Acquisition" it states that students need to gather evidence . . . "from a mix of sources" to qualify as emerging, and yet that requirement of a "mix" is missing (or assumed?) in the more advanced stages. Here again, some CT activities, especially at lower levels of instruction, may not require that students draw from a mix of sources, and perhaps "a mix" or multiple sources should be linked to one of the higher achievement levels and not to emerging.

Type of feedback

  • Outcomes

Feedback is being provided for the following Outcome Document

Critical Thinking

October 25, 2017

The twin goals of statewide general education and institutional flexibility are not as compatible as we'd like to think. If we're going to have a statewide program, communication deserves a higher priority than 6 credits. All institutions offer public speaking, so it is entirely feasible to add oral communication to the model. These skills are essential to succeed in today's competitive job market: employer surveys rate oral communication as the #1 desired skill, and it is also their #1 complaint about college-educated new employees. Add the negative effects of smartphone obsession and our poisoned political climate, and it's clear we need citizens trained in articulate, ethical speech. Thus:

Written communication: 6 credits... Communication (written), critical thinking, digital literacy
Oral communication: 3 credits... Communication (oral), critical thinking, personal & social responsibility

I'd also like to see a second lab science requirement. Some said this may be impractical for smaller institutions, so I'm less adamant about it. But we live in a complex global economy that requires graduates who are prepared to solve the big problems.

Type of feedback

  • Models

October 23, 2017

I reject the notion that New Mexico's General Education framework needs to be re-designed. As a practical matter, if our higher education institutions conform to North Central Association/Higher Learning Commission Accreditation standards and so maintain accreditation, then a very substantial reason for keeping our current New Mexico Higher Education Department general education model remains intact. To still argue for a change in the general education framework would mean that either (a) we no longer believe our general education framework serves our students or (b) we are not effectively assessing within the spirit of the current NMHED framework.

Supposing we are no longer serving our students, we embark on a major mission of discovery and reform. The research and articles explored by the New Model's committee members appear to have at the national level identified deficiencies in what have been defined as Essential Skills. We as members of higher education would need to not only agree that our students lack such skills, but that such skills are representative of general education. Closer inspection of many of the Essential Skills reveals an intellectual maturity much more readily associated with a Bachelor's level of performance, if not at graduate degree competency levels. Stipulating that such skills are indeed associated with general education means such skills manifest upon completion of an Associate's Degree. I remain unconvinced that synthesis of ideas, integration of multiple perspectives, judgement of the merits of arguments, and perception of the consequences of social responsibility fall within the necessary skill set for a student seeking an Associate's Degree in Chemistry. Surely the student experiences these facets of inquiry through exploration of general education courses, but a student does not acquire these skills with proficiency unless a Bachelor's or beyond is pursued. Compelling our Associate Degree students to attain such levels certainly does call for an overhaul of general education, necessitating longer contact hours per semester and/or fewer course-level student learning outcomes.

If we are not effectively assessing within the spirit of the current NMHED model, then superficially NCA/HLC would voice their concerns and accreditation would potentially be in jeopardy. All NM higher education institutions are acutely aware of the need for effective assessment and most provide satisfactory assessment. When assessment is judged unsatisfactory, institutions are provided time and guidance to improve assessment efforts (while I do not profess to know NM accreditation history, I suspect that if the state routinely failed in that regard I would be aware of that reality). When assessment RESULTS are judged unsatisfactory, then schools, departments, and ultimately faculty develop strategies anticipated to improve student learning. Without NCA/HLC judging our assessment efforts unsatisfactory, that can only leave us as institutions, departments, and faculty judging our own assessment efforts to be unsatisfactory -- for if we didn't, then why propose a redesign of the NMHED model? As a faculty member of significant years, I have heard from neither my administration nor fellow faculty a concern that our assessment efforts are unsatisfactory. And certainly there have been no calls, loud or soft, for a redesign of assessment.

With neither accrediting agencies nor administration nor faculty voicing pronounced concerns over the NMHED general education framework, I cannot see a need for a new model for New Mexico's general education framework.

Type of feedback

  • General Feedback

October 23, 2017

THE NEW GENERAL EDUCATION INITIATIVE

Impressions of the Initiative as outlined at the New Mexico State University site:

The Initiative is not intending to design general education from the ground up as suggested. A ground-up design would put more facets of general education under scrutiny and potential change, including an exploration of the merits of traditional fixed-length semesters versus open-length, competency-based course structuring; the pedagogical strengths and weaknesses of distance learning versus face-to-face learning; and the efficacy of service learning and internships, to name a few.

Instead the Initiative is largely about reformulating assessment and implementing an additional layer of assessment. A better catch-phrase for the initiative would be “Recreating General Education Assessment and Proposing Assessment of Uber-General Skills Transcendent of General Education Content Areas”.

The Six Enumerated Content Areas and Their Skills Proposed in the New General Education Model:

The draft model specifies the six Content Areas of
1) Communications (6 hours)
2) Mathematics (3 hours)
3) Science (4 hours)
4) Social & Behavioral Science (3 hours)
5) Humanities (3 hours)
6) Creative and Fine Arts (3 hours)

Twenty-two credit hours fulfill the above six areas and nine additional hours may arise from the above Content Areas or from other areas including foreign languages, interdisciplinary studies, etc.

The current New Mexico Higher Education Department General Education Core Competencies enumerate the following five Content Areas:

1) Area I: Communications (9 hours)
2) Area II: Mathematics (3 hours)
3) Area III: Laboratory Science (8 hours)
4) Area IV: Social/Behavioral Sciences (6-9 hours)
5) Area V: Humanities and Fine Arts. (6-9 hours)

The new Content Areas are identical in spirit to the already adhered-to NMHED content areas, and so at least in terms of content areas no re-designing of general education is taking place. There is, however, a change in the credit hour allocations as the new model permits institutions to choose from among content areas not specifically enumerated in the draft model’s six.

Continuing with the draft model and its six content areas, there are five Essential Skills proposed to be closely aligned with their respective content areas. Collectively, these skills are

A) Communication
B) Critical Thinking
C) Information & Digital Literacy
D) Personal & Social Responsibility
E) Quantitative Reasoning

Each Essential Skill has a corresponding rubric, and every rubric itemizes component skills together with what constitutes “Emerging”, “Developing”, and “Proficient” student performance levels. The component skills are broad-based and are designed to apply to general education courses within the content area of interest. For example, a general education course within the Content Area of Communications has Essential Skill Components of “Genre and Medium Awareness”, “Strategies for Understanding and Evaluating Messages”, and “Evaluation and Production of Arguments”. Likewise a general education course designated as supporting Quantitative Reasoning has the three Essential Skill Components of “Communication/Representation of Quantitative Information”, “Analysis of Quantitative Arguments”, and “Application of Quantitative Models”.

General Education Assessment under the New Model:

Every department has its own general education courses in which general education assessment is performed. Further, the process of assessment is experienced from different vantages ranging from the student, to instructor, to assessment coordinator, to the director of assessment for the institution, to the New Mexico Department of Higher Education itself. All assessment activities begin with an assessment tool that the instructor provides to the student. By design, a portion of the student’s performance is judged by the assessment tool (an exam question, a project, etc.) and in turn a portion of the student’s course grade is determined by the assessment tool. The more comprehensive the tool, the more information gleaned from it, and the more time spent on it by the student. For an assessment tool to possess the characteristics necessary for the new model, it must possess sufficient depth and breadth to meet the student learning outcomes of the course, the NMHED competencies, and the Essential Skill Components.

Suppose we are engaged in assessing a general education physics class. Our assessment tool is a small battery of exam questions on the application of Newton’s Laws of Motion. Since the course being taught is a Content Area of Science under the new model, the assessment tool must assess at least one of the Essential Skills of Critical Thinking, Personal & Social Responsibility, or Quantitative Reasoning. Of course, the tool must also be assessing the Science Content Area Components of “Scientific Literacy”, “Scientific Reasoning”, and/or “Experimental Techniques, Methods, and Design”. Unless NMHED adopts the draft model, it may also need to assess one or more of the NMHED five enumerated competencies for NMHED Area III.

The very simplest scenario of assessment would be as follows:

Assessment Tool: Third Law of Motion Question
Course-Level SLO: Apply Newton’s Laws of Motion to solve problems
Draft Model Science Content Area: “Scientific Reasoning”
Draft Model Quantitative Reasoning Essential Skill: “Application of Quantitative Models”
NMHED Competency: Apply Quantitative Analysis to Scientific Problems

Perhaps a priori the tool is designed to address all at once the course-level SLO, draft model science content area, draft model quantitative reasoning skill, and NMHED competency. This means an all-new assessment tool is created. For already implemented assessment tools, a kind of taxonomy of the assessment tool must be done. There may be fortuitously designed tools that already align nicely, and there may be ones that are not quite so discernible in their alignments. The above mapping applies to one question hypothesized to be addressing one course-level SLO, one Content Area, and one Essential Skill.

There may be as many as three course-level SLOs being assessed within the course, the course may need to assess as many as three Science Content Area components, and as many as three Essential Skill components. Further, rubric scoring of the course-level SLO must lend itself to rubric scoring of the Science Content Area and rubric scoring of the Essential Skill – unless separate assessment tools are designed to overcome misalignment or abandon the challenge of alignment altogether. Were it a medical prescription, the choice would be between one very, very large horse pill or thirty small pills.

It is quite a feat to imagine what a data point might look like from such an assessment endeavor. One possibility is that it is an ordered triple: (course-level SLO rubric score, content area rubric score, essential skill rubric score). With potentially multiple course-level SLOs assessed, it becomes an ordered quadruple: (course-level SLO, course-level SLO rubric score, content area rubric score, essential skill rubric score). These scoring objectives rest upon the student providing enough work to fill such performance dimensionality as well as the instructor scoring the performance. It seems quite possible that assessment could overwhelm an entire exam or drive the construction of the entirety of a project. Further, each instructor becomes quite a data manager after removing their (hopefully very durable) assessor hat.
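The multidimensional data point described above can be sketched as a simple record. This is purely a hypothetical illustration of the bookkeeping burden the comment describes; the field names, SLO text, and scores are my own, not part of the draft model:

```python
from collections import namedtuple
from statistics import mean

# One data point per student per assessment tool, as the ordered quadruple:
# (course-level SLO, SLO rubric score, content-area rubric score,
#  essential-skill rubric score)
AssessmentPoint = namedtuple(
    "AssessmentPoint",
    ["slo", "slo_score", "content_area_score", "essential_skill_score"],
)

# Hypothetical scores from one exam question across four students
points = [
    AssessmentPoint("Apply Newton's Laws", 3, 2, 2),
    AssessmentPoint("Apply Newton's Laws", 2, 2, 1),
    AssessmentPoint("Apply Newton's Laws", 4, 3, 3),
    AssessmentPoint("Apply Newton's Laws", 3, 3, 2),
]

# The instructor-turned-data-manager must then aggregate along every dimension
avg_slo = mean(p.slo_score for p in points)               # 3.0
avg_content = mean(p.content_area_score for p in points)  # 2.5
avg_skill = mean(p.essential_skill_score for p in points)  # 2.0
print(avg_slo, avg_content, avg_skill)
```

Even this minimal sketch shows three separate rubric scales riding on a single exam question; multiply by several SLOs, components, and courses and the data-management load the comment anticipates becomes apparent.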

An assessment coordinator gets to compile such multidimensional data and conduct rather sophisticated analyses. Qualified assessment coordinators likely need advanced data management and data analysis skills. Orchestrating a department’s assessment efforts, from design of assessment tools through data analysis, would begin to resemble the job of a statistician. At the college or university level, Directors of Assessment now have greatly expanded reports to review, and have the privilege of confirming consistent assessment across the multidimensional rubric-by-rubric-by-rubric spaces.

A worthy enterprise for provost and committee members involved would be to present a specific assessment tool rising to the configuration of the new model’s dimensionality. For the moment, where the assessment wheel actually meets the academic road appears to be unexplored.

Doing More without More:

With a suggested need to assess Essential Skills as a layer added on top of the general education competencies already assessed per NMHED guidelines, we find an inadequacy in general education being implied. Certainly this inadequacy is alluded to by the phrase “…design general education from the ground up.” Really putting the money where the mouth appears to be would mean enhancing measurement of student performance on the Uber-General General Education Skills known as Essential Skills, which of course means enhancing student performance on these Essential Skills. This would necessitate de-emphasizing course-level skills in order to maintain pedagogical throughput equivalence between the old and new assessment frameworks. Failing such equivalence, we intend to do more without more, much as we might drive a car further without any need for additional gas.

Type of feedback

  • General Feedback

October 23, 2017

Disclaimer: I reject the notion that New Mexico's General Education framework needs to be redesigned.

Still, hypothetically accepting the notion of redesign leads to a substantial concern regarding the Content Area of Mathematics and its rubrics. A colleague informed me that the New Mexico Higher Education Department itemized the three math courses of College Algebra, Statistics, and Survey of Mathematics some time ago as a means to address the notion of "pathways" -- that is, College Algebra is the general education math course of choice for a science, technology, engineering & math pathway, Statistics for (arguably) a health & wellness pathway, and Survey of Mathematics for a liberal arts pathway. By doing so, NMHED can at least plausibly be seen as directing curriculum, for its rubrics specify course-level student learning outcomes in each of those three pathway courses. Now under the notion of faculty ownership of curricula, a case could be made that NMHED exercised more authority over the mathematics curriculum than it rightfully should. I do not claim any sort of egregious overstep with disastrous consequences, merely that a case could be made.

If we proclaim that NMHED made no overstep, then I claim a lack of parity in the new general education model between Mathematics and the other Content Areas. Upon inspection of all of the Content Areas, the Mathematics rubrics are the only rubrics in which courses are specifically listed and in which rubric resolution is specifically targeted at the course-level. All other Content Areas are free to choose whichever courses suit the proposed rubric as well as whichever course-level student learning outcomes are relevant. I think for parity and for faculty ownership of curricula, the Mathematics Content Area must be dialed back from its course-level resolution to only a student learning outcome resolution. Therefore there should be only one Mathematics Content Area rubric as is the case for all of the other Content Areas.

Type of feedback

  • Outcomes

Feedback is being provided for the following Outcome Document

Mathematics

October 23, 2017

Many of the skills listed require a proficiency well beyond what would be reasonable for an introductory-level, general education course. Often, the skills listed are applicable to very specific courses (e.g., the critical thinking rubric applies to a philosophy course, but not to an introductory algebra course). Add this to the fact that we are often stretched to our limits to meet the current course outcomes; adding more outcomes at the expense of core material, for skills that should be honed in higher-level classes, is just a bad idea. And yes, it will be at the expense of core material, because there just aren't enough hours in the course to cover both adequately.

Type of feedback

  • General Feedback

October 20, 2017

I have several comments / concerns about this new set of standards.
1. In many college classes, we struggle to have enough time to cover the basic course outcomes, let alone add additional course objectives such as communication. I weave in the communication skills necessary to function in a quantitative field, but many students have a terrible time dealing with the notation and are barely keeping up with the skills the class requires while communicating in English. What gets cut when we have a snow day or when the teacher is out sick? Class time is stretched very thin as is, and emphasizing these new skills in any meaningful way will tax an overburdened schedule to the breaking point.

2. How do we (as faculty) actually implement an assessment that will measure all of these outcomes? Are we to use a set of questions and then evaluate them on 3 or 4 different rubrics? Or are we to develop a series of assessment questions that each cover one outcome? This will be my final exam now. How do I make a final exam that students can complete in the 1.25 hours allocated while still covering a breadth of questions that address the outcomes of the class? It would really help if you could provide an example of what an assessment tool like this would look like. One must design this instrument in such a way that a mediocre student actually has a chance to pass the final in the time allocated. As much as we might wish, not all students will soar in each of our classes, but the Art History major should still have a fighting chance to pass Biology 1. Just to be clear, I am asking that a sample final exam be designed with different questions flagged to address different assessments. What will this look like, and will it do all of the things we need it to?

3. Once we have collected this data, who is going to analyze it in a meaningful way and communicate the results to the faculty? What I am envisioning is a multivariate input vector and a multivariate output vector. I suggest using something like MANOVA (multivariate analysis of variance). Are you going to control for differences in classroom environment, such as whether calculators are allowed? Doing this analysis on one variable is complicated, and we want to analyze multiple outcomes. I fear that we will produce a mountain of numbers at great cost to our time, while burdening students with a lot of testing to collect those numbers (see the No Child Left Behind programs), and it will yield little to no benefit if it is analyzed at all.
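As a rough illustration of the first step of the multivariate analysis proposed above, the sketch below groups hypothetical rubric-score vectors by a classroom condition and computes each group's mean vector, which is the quantity a MANOVA would then compare. This uses only the standard library; a real MANOVA would require a statistics package (e.g., R or Python's statsmodels), and the group labels and scores here are invented for illustration:

```python
from statistics import mean

# Hypothetical (communication, critical thinking, quantitative reasoning)
# rubric scores per student, grouped by a classroom condition
scores = {
    "calculators": [(3, 2, 3), (2, 2, 2), (4, 3, 3)],
    "no_calculators": [(2, 1, 3), (3, 2, 4), (2, 2, 3)],
}

# MANOVA tests whether the groups' mean vectors differ across all
# outcomes jointly; computing those vectors is the starting point.
mean_vectors = {
    group: tuple(mean(col) for col in zip(*rows))
    for group, rows in scores.items()
}
print(mean_vectors)
```

Even this toy setup makes the commenter's point concrete: every added outcome adds a dimension to every group comparison, and every added classroom condition adds a group.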

4. We are going to redesign the assessment process for every college in the state. Could we at least consider some sort of pilot program at one (or a small group) of colleges, or at schools within a university, to test this process out before changing the system wholesale? Having this pilot program would answer a lot of my concerns: there will be assessment instruments used by actual departments, we can see how the analysis is going to work, etc. All programs have some unforeseen problems; why not test this new system in a limited environment before moving to it statewide?

Finally, I think these outcomes are wonderful, if lofty, ideals. I hesitate to call them goals because goals are measurable, like assessments. I don't see how we can measure these outcomes meaningfully, nor do I think much thought has been given to the implementation of these ideals to turn them into goals.

Type of feedback

  • General Feedback

October 19, 2017

I am a general education writing instructor and Ph.D. student in English (focusing on Rhetoric and Professional Communication) at New Mexico State University, and I echo many of the concerns regarding these revised General Education (Gen-Ed) skills and outcomes, particularly:

(1) the reduction in required communication credits from 9 to 6, and
(2) what appears to be an overall lack of discussion concerning the alignment of these Gen-Ed skills and outcomes with the Common Course Numbering System (CCNS) Outcomes and Meta-Majors.

(1) Reduction in Required Communication Credits from 9 to 6

I am concerned that we are doing our students a disservice in reducing their opportunities to engage in dedicated writing instruction by instructors who have training and continued professional development in the teaching of writing, considering that (1) the goals of these revised Gen-Ed skills and outcomes involve preparing students for job opportunities and lifelong learning, and (2) written communication skills are highly important for access to and successful completion of post-secondary education, as well as highly ranked in employer surveys such as those conducted by the American Association of Colleges and Universities (AACU) (http://www.aacu.org/sites/default/files/files/LEAP/nchems.pdf) and the National Association of Colleges and Employers (NACE) (http://www.naceweb.org/career-development/trends-and-predictions/job-outlook-2016-attributes-employers-want-to-see-on-new-college-graduates-resumes/).

And while the current plan is for institutions to have 9 discretionary Gen-Ed credits, part of which could be applied toward communication, it is highly possible that not all institutions will choose to apply these credits in this area. So students at some New Mexico (NM) public institutions could have additional opportunities to work on their written communication skills while others do not – a possibility which seems especially problematic to me given, once again, how important these skills are to employers and for post-secondary education, and also considering that these changes to the Gen-Ed curriculum are “to ensure that *all (emphasis added) graduates of New Mexico institutions of higher education have the transferable skills and content knowledge essential for success in the workplace and for lifelong learning,” according to the Statewide General Education Steering Committee’s PowerPoint for NM higher education institutions from October 2017.

(2) More Discussion Needed Concerning Alignment of Gen-Ed Skills and Outcomes with CCNS Outcomes and Meta-Majors

While I understand that Gen-Ed reform is the focus of the conversation here, I have been concerned since attending a Statewide General Education Steering Committee meeting in February 2017 with another graduate student (in place of a committee member with scheduling conflicts) that these Gen-Ed reforms are not being developed in much widespread conversation with the CCNS and Meta-Major initiatives.

No one I asked at the meeting seemed aware of the possible need to align the approved course outcomes from the CCNS courses (which I believe are all Gen-Ed courses) with the Gen-Ed skills and associated outcomes. In fact, these individuals did not seem to be aware that there were outcomes for the CCNS courses as well, and that 80% of the approved outcomes for each CCNS course would have to be adopted by institutions. One committee member who was also a part of a CCNS subcommittee raised a similar question about alignment, and received a response which I felt did not answer the question.

The CCNS was briefly discussed during a Gen-Ed feedback meeting at NMSU on October 17, 2017, but outcome alignment was not mentioned here either. And based on other feedback during that meeting in February, and from what I have read on this site, it does not seem like enough linkages have been made with the Meta-Major effort as well. (Note: From January-August 2016, I was one of several graduate assistants who worked with NMHED on the CCNS initiative, which is why most of my observations have been about CCNS.)

I was reassured during the Q&A for that feedback meeting that outcomes for the approved CCNS courses should align with the Gen-Ed essential skills and their associated outcomes, but I do hope that there is and/or continues to be some awareness at all levels of this Gen-Ed reform effort of how these Gen-Ed changes fit into the broader "trifecta" of reforms (Gen-Ed, CCNS, and Meta-Majors) outlined by Secretary Damron during the webinar she conducted with NMHED in January 2016. I think this increased awareness of how all of the parts fit together will help with streamlining the implementation of all of these reforms for the institutions involved.

Warm Regards,

Kavita Surya
General Education Writing Instructor/Graduate Student
New Mexico State University

Type of feedback

  • Outcomes
  • Models
  • General Feedback

Feedback is being provided for the following Outcome Document

Communication

October 17, 2017

Thanks to the committee for their work on this complicated, important task. Like a number of my colleagues from the English department at NMSU, I have serious concerns about dropping the general education communications requirement from 9 to 6. This would likely mean that many students would take only one writing intensive course (and what I mean here by writing intensive is not just a course that requires significant writing but a course that takes writing as its principal topic of instruction) along with one speech communication course. This places us significantly outside the mainstream with regard to our peer institutions.

With one exception, all of our peer institutions require two first year writing classes (or the AP, CLEP, Int’l Bacc equivalent). About half require in their gen ed curriculum additional writing intensive classes beyond those two first year writing classes. The one exception, Washington State, requires one first year writing class plus multiple, specially designated, upper level writing in the disciplines classes (so the proposed six credits of communication does not parallel Washington State’s gen ed curriculum). Looking beyond our peers, Georgia State, an institution that I know has been followed quite closely by NMSU’s administration, requires two first year writing classes.

NMSU is a land grant university with fairly open admissions requirements (as we should have) in a state ranked very low in secondary education (in saying this, I want to emphasize that I do not blame our hardworking and underpaid high school teachers). This means that we (and the point extends to other institutions in NM) cannot assume our students will arrive sufficiently prepared to succeed in their fields of study with only one writing intensive general education course. The ramifications of such a change could be significant, posing a challenge to (for example) our ability to increase retention and completion rates, an imperative put before all of us by the state legislature.

There are, of course, many other reasons for not reducing general education communications requirements. For example, a recent AACU survey of employers emphasizes, again and again, that analytical writing/critical thinking ability are at or near the top of desirable job skills: http://www.aacu.org/sites/default/files/files/LEAP/nchems.pdf

So I find it a little puzzling that we, unlike all of our peers, seem to believe that our students will succeed with less – rather than more – writing instruction.

I would strongly encourage the committee to retain (at bare minimum) nine credits of required gen ed communications courses.

Thanks,
Ryan Cull
Associate Professor
Director of Undergraduate Studies, English Department
New Mexico State University, Las Cruces

Type of feedback

  • Models

October 17, 2017

Feedback on the Proposed General Education Model and Outcomes

I attended the open forum at UNM on October 3. I did not hear at that session, and I have not heard from faculty at CNM in the School of Math, Science, and Engineering, any concerns about the proposed content areas other than that the laboratory science SLOs focus on lab courses. Because labs and theory classes are so frequently uncoupled across the state's HEIs (with the lab frequently not required at all in many programs), the SLOs should be revised to be general enough to apply to so-called lecture or theory courses as well ("so-called" because active learning strategies are gradually transforming those courses into something other than straight lecture). Generalizing their language to some extent should be a straightforward process and should be done by faculty who teach lower-division science classes.

I have over the past few weeks and during the UNM forum heard much discontent about the addition of the essential skills rubrics, discontent expressed in the form of concerns about the implementation of them. I suggest a compromise based on my own understanding of the proposals and the various concerns that have been shared in many discussions I have had with faculty at CNM, especially in the School of MSE. After the suggested compromise immediately below, I offer some of my reasoning behind it. In part, however, what I suggest resides on the notion that we can do more, or at least as much, with somewhat less:

• The proposed model aligns critical thinking essential skills with all content areas. Retain that primary focus. But instead of aligning two more sets of other essential skills, each with three or more outcomes to measure, drop the number from each set to one skill as a secondary focus for incorporation and assessment.
• Require each HEI to determine at least three (and no more than five?) critical thinking essential skills to be assessed in each general ed course in each content area, plus one each from the two secondary areas proposed for each content area. But leave it up to the faculty of each HEI to decide the nature and scope of the skills as appropriate to the content area, in which courses those skills will be assessed, and in what way those skills will be assessed, with a goal of having courses certified by whatever the deadline will be. In other words, I recommend the rubric for math be called “Critical Thinking Essential Skills for Math”; the rubric for creative and fine arts, “Critical Thinking ES for Creative and Fine Arts”; and so on. In math, at least at CNM, we have general education courses in three pathways: algebra, statistics, and liberal arts math. CNM math faculty can determine what three (to five) critical thinking skills will be assessed as appropriate to each course, and which skills in the other two areas. Although I would expect much, if not complete, commonality in the critical thinking skills to be assessed, I would not be surprised if each course has some differences, because the courses have somewhat different goals and objectives. The same, although with probably much more variety, would hold for the secondary areas. A secondary focus of communications skills as determined by math faculty in liberal arts math, statistics, and algebra could be quite different among the three pathways. I could imagine similar differences in the behavioral and social sciences, and probably in all content areas, depending on the secondary foci chosen by their respective faculty.
• Strongly recommend that faculty in each HEI who teach in the various content areas work together across their disciplines to decide on the essential skills for their content areas—for example, physics, chemistry, biology, EPS, and other science faculty should meet together to sort these matters out. Ideally, they should choose the same set of “primary” essential skills to incorporate into their classes, with modifications appropriate to their individual subjects as necessary. At minimum, I personally believe they should share at least two common categories among the disciplines, though not necessarily the same two for each discipline to assess, with appropriate modifications reflective of the content areas. In other words, with three categories to assess, discipline A might assess categories 1 & 2 and perhaps a third, “X,” not in the rubric issued to guide these discussions; discipline B, categories 2 & 3 and “Y”; discipline C, categories 3 & 1 and “Z”; discipline D, categories 1 & 2 and “J”—and so on, or they can decide on some other pattern, so long as two of the three recommended categories are covered.
• Strongly recommend that faculty in each HEI who share “secondary” essential skills areas work together to ensure broad coverage of the essential skills in those areas. Ideally, different content areas should assess for different categories of skills so that students have a broad introduction to different aspects of the “secondary” essential skills that are appropriate to the specific subjects. I am well aware that one of the most universally cited qualities employers look for in new employees is “creativity” (it seems always to be in the top three listed). I am also aware of at least one theoretical approach that generalizes the creative enterprise across broad fields of endeavor (see Howard E. Gruber’s chapter, “The Evolving Systems Approach to Creative Work,” in Creative People at Work, Oxford University Press, 1989). But I also understand that an “evolving systems approach to creative work” in the arts doesn’t work exactly the same way as in the sciences, politics, psychology, technology, internet marketing, or architectural design. Indeed, it varies greatly among individuals within fields. It varies greatly between Google (and within Google, viz. its “X” division) and Sandia National Labs and medical research and theoretical physics. Cross-disciplinary creative enterprise is possible, wonderful, and imperative now—and usually an iteration of difficulty beyond the application of creative skills within a field or discipline—so multiple introductions to all the secondary essential skills seem essential.
[Note: the previous two bullets can lead to a modified version of the notion of “cohorted curricula” mentioned at the UNM forum.]
• Come back in five years or so (or at least two years after the certification deadline) and compare results of SLO assessment and rubric viability by sharing “lessons learned.” This check-in could be handled, perhaps, through the NM HEAR conference that year.

Here follows a summary of my motivation for suggesting the compromise above:

First, these recommendations should not require any further editing of, or adjustments to, SLOs already completed for the common course numbering project. They are iterative enough to allow the new sets of SLOs to be tested until the last bullet’s suggestion occurs, after which statewide articulation task forces can convene to edit or tweak the SLOs developed for the CCN project.

At the UNM session, I heard with great interest the claim, made several times, that the proposed SLOs and their rubrics, in particular the essential skills set, and by inference the content area set, are intended to be guidelines. However, I disagree with that characterization, given the expectations the HLC brings to accreditation standards. Once the SLOs and rubrics are adopted, the HLC will expect NM HEIs to assess for them. Today, they can be considered guidelines from an ideal, faculty-driven, governance perspective; but after adoption, in application and results, they are mandates.

So they must be carefully considered in light of the fact that whatever is adopted will become accreditation requirements for assessment and will require, as stated at the session, changes in overall course content and pedagogy.

This latter factor raises more concerns regarding
• the training necessary for faculty in order to assess many of the essential skills in a professionally competent manner;
• the intellectual or experiential levels of many of the outcomes at the proficiency level;
• the numbers of essential skills aligned with content areas;
• and the impact on course content and rigor.

For an example of the concern about training and professionally competent assessment, and those two factors’ relationships to content and pedagogy, take the communications skills rubric’s requirement to “identify and develop claims that are supported by evidence and reasoning; evaluate and integrate arguments of others into their own written and spoken arguments” as applied to the mathematics content area. This proficiency recommends a fantastic goal for all students in a liberal arts education. However, it specifies written and spoken argumentation. How will general ed math courses incorporate those two approaches to argumentation? Will math faculty have to adopt writing-across-the-curriculum approaches and expect a research essay along the lines of what is typically the last essay assigned in a first semester composition course, or any essay in a second semester composition course? Given the content area, a persuasive report or proposal, a common component of technical writing, would be even better. Students often don’t take composition right away despite recommended term-by-term degree plans, so what effect will that lack of experience have on student demonstration of the proficiency? Since the “and” means that math courses must also incorporate some sort of oral rhetorical experience (Quintilian would be proud!), perhaps an oral report or a debate would be necessary. Public speaking is not universally required for the communications content area. How will an instructor of algebra, statistics, or liberal arts math actually incorporate these activities in their courses and maintain credible coverage of the math content itself, in the typical 45-48 hours of class meeting time in a semester?

Based on a sample College Algebra syllabus I have seen which incorporates the three essential skills rubrics proposed for the mathematics content area, I think the answer is, “It cannot credibly be done.”

And then there is the assessment factor itself. I have no doubt math faculty know a good written or oral argument when they read or hear it. But the last time most of them had any guided or dedicated training in the rhetorical, grammatical, stylistic, punctuation, or editorial strategies for developing well-rounded written or oral arguments, with sound evidence and reasoning and the integration of others’ arguments, was probably in their own general education experiences 5, 10, 20, or more years ago. Even thesis and dissertation processes don’t consistently provide that sort of guidance, if at all. Certainly they have employed those skills post-general education and in their working lives, but using skills does not by itself develop the professionally credible capacity to assess those skills in the manner expected by the HLC, by our communities, or for the success of students. It is great that faculty across curricula do require essays and oral reports. But many of them outside the composition and rhetoric and public speaking faculty, I know, will balk at assessing, other than vaguely, the standard attributes that go into a good written or oral argument. They might vaguely do so in relation to their courses’ content, but, as they have told me over the years, they are only partially competent (some have said minimally, or not at all) to do so for the other areas that go into a proficient written or oral argument. That issue leads to several questions: How will they become competent enough to assess for proficiency in this essential skill? Do we send them all back to school? Do we require crash training? How much would either approach cost? How much time will it take? How disruptive will it be to faculty and academic administrators while they maintain everything else that goes into instruction, schedules, professional reviews, morale, employee persistence, and so on?

The previous paragraph may be slightly more applicable to my concerns for community college faculty than for university faculty, in part because CC faculty don’t have research or publication requirements. Their writing and speaking skills therefore generally focus on factors associated with instruction, a somewhat narrower and more specialized milieu in which to practice compositional and technical writing skills and oral skills. Regardless, in my work experiences at four-year HEIs, I cannot say that all four-year institutions’ faculty are great writers or speakers. I know my own weaknesses as a speaker—I am not great at arguing orally, though I do pretty well teaching in front of a class.

Nevertheless, assessing for essential skills proficiencies in general education courses in any HEI in NM requires careful thought regarding the credible assessment of those skills by faculty who may not be as qualified to do so as we might hope, and in some cases may not ever become qualified. What do those ultimately responsible for faculty performance actually do about the latter situation?

As for the level and quantity concerns I listed above, I worry about the essential skills and the essential nature of general education courses. The courses are broadly designed to survey topics in order to introduce students to fields of study. The proposed essential skills are not survey-adaptable. Argumentation in the sciences is not exactly the same as argumentation in the creative arts, nor in the social and behavioral sciences. Counterintuitively, the essential skills as proposed are, in my opinion, not specific enough. I include this comment because the essential skills are responding in part to concerns about preparing graduates for work. As stated in my example above about creativity, I do not agree that there is a common understanding among employers about what represents the successful application of any of the essential skills in their specific companies, industries, or agencies. So I think it is virtually impossible to generically apply these skills across curricula. Instead, they should be taught and assessed in context-specific situations, with students getting as many opportunities as possible to experience different approaches to the introduction of these skills, if not their mastery, so that they become more adept at transferring them within an evolving, if not chaotic, work world post-graduation.

At the UNM meeting, I heard comments that some of the essential skills rubrics contain proficiencies that are better assessed in upper division courses. A response to that concern involved the notion that these are ideals, and that we should not expect all students to achieve them. Even so, I do agree that some of the proficiencies are better assessed at the upper division bachelor’s level in some of the content areas. The example I give above for oral and written arguments in math courses is one of those. I do not think it should be incumbent on first semester, college level algebra, statistics, or liberal arts math faculty to ensure proficiency as currently proposed with that particular essential skills outcome. I hold the same concern about several others assigned to other content areas. However, I do believe a math or statistics bachelor’s degree graduate should be proficient in that essential skill for work in the field of mathematics, and so perhaps bachelor’s degree programs in math and statistics should ensure that particular proficiency in their upper division courses.

Yet that proficiency is a likely candidate for communications courses. Survey courses across the liberal arts and sciences typically start and move students toward the ideal goals of a full and comprehensive liberal arts education, and much less frequently complete them. So faculty must carefully determine how far along the essential skills proficiency paths in each of the content areas they can lead students while ensuring the highest level of credibility and integrity for the course outcomes of their content.

Still, I do believe that the emerging level of the argumentation category in the communications essential skills rubric should be, and probably ubiquitously is, addressed in first semester, college level math and science courses—as it is in most lower division general ed courses. As such, it is assessable. If it is secondary to critical thinking skills, I believe we can agree that it does not need the same proficiency level, or the same assessment, as the primary skills.

I am very, very concerned about the alignment of three essential skills with each content area. It is simply not feasible, as I stated above in my comment about the sample SLOs for College Algebra. One of the comments I heard at the UNM meeting involved looking at the proposals as a chance to develop “cohorted curricula.” I do like that idea, so I am not arguing against incorporating some of the proposed essential skills into the proposed content areas. Indeed, one of the emerging trends for future employment of college graduates involves the hybridization (i.e., the interdisciplinary nature) of work skills and the development of the “gig employee” for the “gig economy,” which requires entrepreneurial and creative skills not typically taught in all general education courses. However, I think the proposed alignments “over-hybridize” the content and skills: they are too complicated and too numerous for faculty to incorporate, and it would thus be virtually impossible for most students to demonstrate real, competent, credible proficiency in all the essential skills as proposed for each content area.

I hope my suggested compromise to the proposed models is helpful and leads to an even better approach. Otherwise, I apologize for the length of this tome, and appreciate your consideration of it.

Submitted Respectfully,

John B. Cornish,
Dean, School of Math, Science, and Engineering
Central New Mexico Community College

Type of feedback

  • General Feedback

October 17, 2017

Feedback on the Proposed General Education Model and Outcomes

I attended the open forum at UNM on October 3. I did not hear at that session, and I have not heard from faculty at CNM in the School of Math, Science, and Engineering, any concerns about the proposed content areas other than the laboratory science SLOs focus on lab courses. Because labs and theory classes are so frequently across the States HEIs uncoupled (frequently with the lab not required at all in many programs), they should be revised to be general enough to apply to so-called lecture or theory courses, as well (but “so-called” because active learning strategies are gradually transforming those courses into something else besides straight lecture). Generalizing their language to some extent should be a straightforward process and should be done by faculty who teach lower division science classes.

I have over the past few weeks and during the UNM forum heard much discontent about the addition of the essential skills rubrics, discontent expressed in the form of concerns about the implementation of them. I suggest a compromise based on my own understanding of the proposals and the various concerns that have been shared in many discussions I have had with faculty at CNM, especially in the School of MSE. After the suggested compromise immediately below, I offer some of my reasoning behind it. In part, however, what I suggest resides on the notion that we can do more, or at least as much, with somewhat less:

• The proposed model aligns critical thinking essential skills with all content areas. Retain that primary focus. But instead of aligning two more sets of other essential skills, each with three or more outcomes to measure, drop the number from each set to one skill in as a secondary focus for incorporation and assessment.
• Require each HEI to determine at least three (and no more than five?) critical thinking essential skills to be assessed in each general ed course in each content area and one each from the two secondary areas proposed for each content area. But leave it up to the faculty of each HEI to decide the nature and scope of the skills as appropriate to the content area, in which courses those skills will be assessed, and in what way those skills will be assessed, with a goal of having courses certified by whatever the deadline will be. The rubric for math, in other words, I recommend should be called “Critical Thinking Essential Skills for Math’; and “Critical Thinking ES for Creative and Fine Arts” for that content area; and so on. In math, however, at least at CNM, we have general education courses in three pathways, algebra, statistics, and liberal arts math. CNM math faculty can determine what three (to five) critical thinking skills will be assessed as appropriate to each course and which skills in the other two areas. Although I would expect much, if not even complete, commonality in the critical thinking skills to be assessed, I would not be surprised if each course has some differences because the courses have somewhat different goals and objectives. The same, although with probably much more variety, would hold for the secondary areas. A secondary focus of communications skills as determined by math faculty in liberal arts math, statistics, and algebra could be quite different among the three pathways. I could imagine similar differences in the behavioral and social sciences, and probably in all content areas, depending on the secondary foci chosen by their respective faculty.
• Strongly recommend that faculty in each HEI who teach in the various content areas work together across their disciplines to decide on the essential skills for their content areas—for example, physics, chemistry, biology, EPS, and other science faculty should meet together to sort these matters out. Ideally, they should choose the same set of “primary” essential skills to incorporate into their classes, but with modifications appropriate to their individual subjects, as necessary. At minimum, I personally believe they should have at least two common categories between each discipline, but not necessarily the same two for each discipline to assess, with appropriate modifications reflective of the content areas. Ideally, in other words, with three categories to assess, discipline A would assess, for example, categories 1 & 2 and perhaps a third, “X,” not in the rubric issued to guide these discussions; discipline B would assess categories 2 & 3 and “Y,” discipline C would assess 3 & 1 and “Z,” discipline 4 would assess categories 1 & 2 and “J”—and so on, or they can decide some other pattern, so long as two of the three recommendations are covered.
• Strongly recommend faculty in each HEI who shares “secondary” essential skills areas to work together to ensure broad coverage of the essential skills in those areas. Ideally, different content areas should assess for different categories of skills so that students have a broad introduction to different aspects of the “secondary” essential skills that are appropriate to the specific subjects. I am well aware that one of the almost universally cited qualities employers look for in new employees is “creativity” (it seems always to be in the top three listed). I am also aware of at least one theoretical approach that generalizes the creative enterprise across broad fields of endeavor (See Howard E. Gruber’s chapter in Creative People at Work, “The Evolving Systems Approach to Creative Work,” Oxford University Press, 1989). But I also understand that an “evolving system to creative work” in the arts doesn’t work exactly the same way as in the sciences, politics, psychology, technology, internet marketing, or architectural design. Indeed, it varies greatly among individuals within fields. It varies greatly between Google (and within Google, viz. its “X” division) and Sandia National Labs and medical research and theoretical physics. Cross-disciplinary creative enterprise is possible, wonderful, and imperative now—and usually an iteration of difficulty beyond the application of creative skills within a field or discipline, so many introductions to all secondary essential skills seems essential.
[Note: the previous two bullets can lead to a modified version of the notion of “cohorted curricula” mentioned at the UNM forum.]
• Come back in five years or so (or at least two years after the certification deadline) and compare results of SLO assessment and rubric viability by sharing “lessons learned.” This check-in could be handled, perhaps, through the NM HEAR conference that year.

Here follows a summary of my motivation for suggesting the compromise above:

First, they should not require any further editing or adjustments to SLOs already done for the common course numbering project. These recommendations are iterative enough to allow the new sets of SLOs to be tested until the last bullet’s suggestion occurs, after which statewide articulation task forces can convene to edit or tweak the SLOs developed for the CCN project.

At the UNM session, I heard with great interest the claims made several times that the proposed SLOs and their rubrics, in particular the essential skills set, and by inference the content area set, are intended to be guidelines. However, I disagree with that declaration in terms of expectations that the HLC brings to accreditation standards. Once the SLOs and rubrics are adopted, the HLC will expect NM HEIs to assess for them. Today, they can be considered guidelines from an ideal, faculty-driven, governance perspective, but after adoption and in application and results, they are mandates.

So they must be carefully considered in light of the fact that whatever is adopted will become accreditation requirements for assessment and will require, as stated at the session, changes in overall course content and pedagogy.

This latter factor raises more concerns regarding
• the training necessary for faculty in order to assess many of the essential skills in a professionally competent manner;
• the intellectual or experiential levels of many of the outcomes at the proficiency level;
• the numbers of essential skills aligned with content areas;
• and the impact on course content and rigor.

For an example of the concern about training and professionally competent assessment and those two factors’ relationships to content and pedagogy, take the communications skills rubric’s requirement to “identify and develop claims that are supported by evidence and reasoning; evaluate and integrate arguments of others into their own written and spoken arguments” as applied to the mathematics content area. This proficiency recommends a fantastic goal for all students in a liberal arts education. However, it specifies written and spoken argumentation. How will general ed math courses incorporate those two approaches to argumentation? Will math faculty have to require writing across the curriculum approaches and expect a research essay something along the lines of what is typically the last essay assigned in a first semester composition course or any essay in a second semester composition course? Given the content area, a persuasive report or proposal would be even better, a common component of technical writing. Students often don’t take composition right away despite recommended term-by-term degree plans, so what effect will that lack of experience have on student demonstration of the proficiency? Since the “and” says that math courses have to incorporate, as well, some sort of oral rhetorical experience (Quintilian would be proud!), perhaps an oral report or a debate would be necessary. Public speaking is not universally required for the communications content area. How will an instructor of algebra, statistics, or liberal arts math actually incorporate these activities in their courses and maintain credible coverage of the math content itself—in typically 45-48 hours of class meeting time in a semester?

Based on a sample College Algebra syllabus I have seen which incorporates the three essential skills rubrics proposed for the mathematics content area, I think the answer is, “It cannot credibly be done.”

And then there is the assessment factor itself. I have no doubt math faculty know a good written or oral argument when they read or hear it. But the last time most of them probably had any guided or dedicated learning in or training about the rhetorical, grammatical, stylistic, punctuation, or editorial strategies to develop well-rounded written or oral arguments with sound evidence and reasoning and the integration of others’ arguments would have been in their own general education experiences 5-10-20 years or more ago. Even theses and dissertation processes don’t consistently provide that sort of guidance, if at all. Certainly they have employed those skills post-general education and in their working lives, but using skills does not explicitly develop the professionally credible capacity to assess those skills in a manner expected by the HLC or our communities or for the success of students. It is great that faculty across curricula do require essays and oral reports. But many of them outside the composition and rhetoric and public speaking faculty, I know, will balk at assessing, other than vaguely, the standard attributes that go into a good written or oral argument. They might vaguely do so in relation to their courses’ content, but, as they have told me over the years, they are only partially, some have said minimally or not at all, competent to do so for the other areas that go into a proficient written or oral argument. So that issue leads to several questions: How will they become competent enough to assess for proficiency in this essential skill? Do we send them all back to school? Do we require crash-training? How much would either approach cost? How much time will it take? How disruptive will it be to faculty and academic administrators while they maintain everything else that goes into instruction, schedules, professional reviews, morale, employee persistence, and so on?

The previous paragraph may be slightly more applicable to concerns I have for community college faculty than university faculty, in part because CC faculty don’t have research or publication requirements. So their writing and speaking skills generally focus on factors associated with instruction, which is a somewhat narrower and unique milieu in which to practice compositional/technical writing skills and oral skills. Regardless, in my work experiences at four-year HEIs, I cannot say that all four-year institutions’ faculty are great writers or speakers. I know my own weaknesses as a speaker—I am not great at arguing orally, though I do pretty well teaching in front of a class.

Nevertheless, assessing for essential skills proficiencies in general education courses in any HEI in NM requires careful thought regarding the credible assessment of those skills by faculty who may not be as qualified to do so as we might hope, and in some cases may not ever become qualified. What do those ultimately responsible for faculty performance actually do about the latter situation?

As for the level and quantity concerns I listed above, I worry about the essential skills and the essential nature of general education courses. The courses are broadly designed to survey topics in order to introduce students to fields of study. The proposed essential skills are not survey-adaptable. Argumentation in the sciences is not exactly the same as argumentation in the creative arts, nor in the social and behavioral sciences. Counterintuitively, the essential skills as proposed are, in my opinion, not specific enough. I include this comment because the essential skills are responding in part to concerns about preparing graduates for work. As stated in my example above about creativity, I do not agree that there is a common understanding among employers about what represents the successful application of any of the essential skills in their specific companies, industries, or agencies. So I think it is virtually impossible to generically apply these skills across curricula. Instead, they should be taught and assessed in context-specific situations, with students getting as many opportunities as possible to experience different approaches to the introduction of these skills, if not their mastery, so that they become more adept at transferring them within an evolving, if not chaotic, work world post-graduation.

At the UNM meeting, I heard comments that some of the essential skills rubrics contain proficiencies that are better assessed in upper division courses. A response to that concern involved the notion that these are ideals, and that we should not expect all students to achieve them. Even so, I do agree that some of the proficiencies are better assessed at the upper division bachelor’s level in some of the content areas. The example I give above for oral and written arguments in math courses is one of those. I do not think it should be incumbent on first semester, college level algebra, statistics, or liberal arts math faculty to ensure proficiency as currently proposed with that particular essential skills outcome. I hold the same concern about several others assigned to other content areas. However, I do believe a math or statistics bachelor’s degree graduate should be proficient in that essential skill for work in the field of mathematics, and so perhaps bachelor’s degree programs in math and statistics should ensure that particular proficiency in their upper division courses.

Yet that proficiency is a likely candidate for communications courses. Survey courses across the liberal arts and sciences typically start and move students toward the ideal goals of a full and comprehensive liberal arts education, and much less frequently complete them. So faculty must carefully determine how far along the essential skills proficiency paths in each of the content areas they can lead students while ensuring the highest level of credibility and integrity for the course outcomes of their content.

Yet, I do believe that the emerging level in the argumentation category in the communications essential skills rubric should be, and probably ubiquitously is, addressed in first semester, college level math and science courses—as it is in most lower division general ed courses. As such, it is assessable. If it is secondary to critical thinking skills, I believe we can agree that it does not need the same proficiency level or the assessment thereof as the primary skills.

I am very, very concerned about the alignment of three essential skills for each content area. It is simply not feasible, as I stated above in my comment about the sample SLOs for College Algebra. One of the comments at the UNM meeting I heard involved looking at the proposals as a chance to develop “cohorted curricula.” I do like that idea, so I am not arguing against incorporation of some of the proposed essential skills into the proposed content areas. For indeed, one of the emerging trends for future employment of college graduates involves the notion of the hybridization (i.e., the interdisciplinary nature) of work skills and the development of the “gig employee” for the “gig economy,” which requires entrepreneurial and creative skills not typically taught in all general education courses. However, I think that the proposed alignments “over-hybridize” the content and skills: they are too complicated and too many for faculty to incorporate, and it would thus be virtually impossible for most students to demonstrate real, competent, credible proficiency in all essential skills as proposed for each content area.

I hope my suggested compromise to the proposed models is helpful and leads to an even better approach. Otherwise, I apologize for the length of this tome, and appreciate your consideration of it.

Submitted Respectfully,

John B. Cornish,
Dean, School of Math, Science, and Engineering
Central New Mexico Community College

Type of feedback

  • Models

October 16, 2017

As a former public school advisor and now NMSU English instructor, I have witnessed the erosion of the critical thinking and communications curricula at the secondary level. Many of our students now enter college with debilitating deficits in their basic thinking (and thus communicating) abilities and knowledge and, without additional bolstering at the university level, become statistics in our poor retention rates.

An English 111 (freshman) student's insights into her own lack of preparation should, I believe, be considered in the proposed changes to the post-secondary gen-ed communication requirements lest we, as educators, perpetuate their cookie-cutter, true/false, multiple-choice-correct-answer, naive thinking. The following is a direct quote, with her permission, in response to a recent (October 2017) assignment:

"After reading the essay on 'Junk Science', I have to say that I strongly agree with the author, Lee Ann Fisher. High School educators are no longer teaching students how to be critical thinkers. Students are not learning the necessities that they will be needing in college and in future careers. Teenagers in high school no longer retain information. Instead, students learn enough about a topic to make it past the next test. This has caused the average ACT and SAT scores to drop over 75 points. High school exams are now written at an 8th grade level and elementary math is made up mostly by coloring. We have lowered the standards for our students and then wonder why 'great minds' are so far and few between."

Another reading in this 111 class asks students to analyze, critique and question information, going "beyond the common high school 5-paragraph essay." Three students who graduated from the same high school in Las Cruces asked what the author meant. As we talked, it was revealed that they had not written anything as long as an essay since their sophomore year.

Thus, at the college level, we must not only teach students to think, read, write, research and communicate in preparation for an ever-evolving workforce, we must also reach back to give them the foundational knowledge they need to begin to function as educated people and even be those "great minds" who contribute to and improve their global societies.

In my 200-level English classes, Technical and Scientific Communication and Business and Professional Communications, I offer the following video, "7 Skills Students Need for their Future" by Dr. Tony Wagner, director of Harvard's Change Leadership Group and author of "The Global Achievement Gap." Critical thinking/problem solving and effective oral and written communication skills are two of the seven skills. New Mexico's students are often disadvantaged due to a variety of circumstances and backgrounds. Let's not cripple them further with inadequate preparation in these necessary skill sets. Here is the link to Dr. Wagner's video: https://www.youtube.com/watch?v=NS2PqTTxFFc.

Thank you very much for soliciting our opinions.

Type of feedback

  • Outcomes

Feedback is being provided for the following Outcome Document

Communication

October 16, 2017

The proposed reduction of general education communications credits poses a serious threat to the quality of education we can offer to students in New Mexico. As the Writing Program Administrator, I am responsible for all general education writing at NMSU. I know how hard it is to train writing teachers; it requires dedicated time, professional development, and resources that we have carefully cultivated in the Writing Program. I am alarmed by the implication that students can do more with a reduction of credits. Student writing and teacher experiences tell us that our students need greater opportunities to practice writing so that they can develop as engaged problem solvers, strong students, and eventually as leaders in their fields. Last winter, the English department had an external program review. The reviewers’ report spoke very favorably about our writing program and its potential to expand. To grow as a program that serves students across the university, we begin by serving our students in general education courses. We need to require, not just suggest, 9 credits of general education devoted to communications. I am also a member of NMSU’s Quality Initiative (QI), a project that has brought a Writing Within the Disciplines focus to the university. The goal of the initiative was to bring attention to writing as a vehicle for thinking and learning at all stages of education. Reducing communications credits reduces this possibility and opposes the work that has made the QI a success.

Lauren Rosenberg
Writing Program Administrator
Chair of the English Department General Education Committee
Member of NMSU's Quality Initiative

Type of feedback

  • General Feedback

October 9, 2017

Observation: It's challenging to write learning outcomes and rubrics that are general enough to apply in a wide variety of courses and to students from a diverse collection of majors.

Solution: Start by creating meta-majors and work from there.

Step 1: Decide on several different meta-majors: STEM, Social/Behavioral Sciences, Humanities, Health Sciences, ... There's evidence that meta-majors enhance transfer and help prepare students for future coursework. This is already a HED project and will require a redesign of general education learning outcomes.

Step 2: Decide on Skills or Learning Outcomes that all students in the meta-major should have mastered after completing the general education core classes. This is much easier and more meaningful than trying to use the same rubric for a future physics major and a future art history major. Yes, we want them both to develop quantitative reasoning skills, but trying to craft a rubric that applies to both makes the rubric useless for either one.

Step 3: Create each meta-major by choosing a subset of general education classes that will deliver the specific learning outcomes from Step 2.

It's meaningful to decide that all future STEM majors should have some particular level of quantitative reasoning skills after taking 30 credits of general education classes. It's also meaningful to decide that future humanities majors should have some particular level of quantitative reasoning skills after taking 30 credits of general education classes.

I don't think it's meaningful to try to write a rubric for quantitative analysis that works for both STEM and humanities majors. Once a rubric is written broadly enough to apply to both students in widely different classes, it's tough to see what applying the rubric can tell us.

Type of feedback

  • General Feedback

October 9, 2017

My opinion:
From skill models, I pick Communications from the Content area and as its skills I choose communication critical thinking.
From the content rubric, I pick ethical reasoning and collaboration skills, teamwork, and value systems (although I’m not sure what value systems mean).
We measure ethical reasoning extensively through speeches in 1130: the speaker's credibility through the use of credible sources, ethical evidence, and proper language. However, students do not compare a range of ethical perspectives and propose an ethical solution based on one or more perspectives. That gets addressed in an advanced public address class, or it is an area covered in philosophy courses.
Also, teamwork and collaboration can be in one place, but value systems should perhaps be their own category.
For example, a group activity consisting of 5 students can be measured by making sure all 5 are working together; if one is not, that is a quantitative measure. Say, in every group of 3 to 5 students, 1 doesn't do the work.
When it comes to collaboration skills and teamwork, group members are encouraged to be respectful of each other and their worldly/"intercultural" views. I have my students complete sheets named "Ethical Listening Commandments" and "Ethical Speaking Commandments." They spend 45 minutes writing guidelines together, and once approved by me, they abide by them in public speaking and group communication classes/projects. The level of success with this has been qualitatively very good.
I do have a comment on assessment suggestions in general: besides exams, projects, portfolios, and papers, maybe we can add informal feedback once in the middle of every semester, so we can change things around upon students' requests. Also, at the end of most classes, we can ask for paragraph summaries and muddiest points for that day.

Type of feedback

  • Models

October 7, 2017

I am a full-time instructor at CNM in Philosophy and General Honors.

Overview: My general impression of the new "Skills Model" is positive, especially if it leads to a simplification of the outcomes assessments. No matter how much I wish that the state could truly see the full value of what we teach reflected in these assessments, I think it is valuable to see that what we teach also leads to direct improvements in these general skills. To focus on my own discipline, students trained in philosophy often show outstanding abilities across a wide range of aptitudes, reflected in their test scores on the GRE, LSAT, GMAT, and even the MCAT. But I hope that this new model does not mean that my courses are valuable only because they enhance these skills. My classes, when I teach well, enhance the lives of my students, helping them to live lives that are satisfying and meaningful to them and that improve their society.

My concerns with assessments over the last decade: These attempts to translate the value of what we teach into terms that can be understood by people outside of teaching and learning will always be frustrating. The models have been improving but still give us only a snapshot of a student's skills, instead of revealing correlations that can be used to judge the effectiveness of a teaching strategy, class, department, school, or curriculum. The scores we see could just as well reflect the student's innate skill set, instead of what they have gained in our classes. What is needed is the ability to evaluate the improvements of students in these skills. Now, of course, it will be difficult to ensure they are measured in consistent ways, but, at the very least, we need to try to find data that can detect, for example, improvement in "Critical Thinking" skills across the student's time at the college. Then we can compare these improvements to see if there is a meaningful correlation between a student's "Critical Thinking" skill and what classes that student took. Our assessments have, instead, seen constant changes that have rendered them unable to find even minimally interesting correlations. Even a very flawed assessment tool is more effective if we can use the same one for a couple of years so that these correlations across time can emerge.

My concerns with the latest drafts of this model and its assessment tools: I think the general skills as titles are workable, but some of the rubrics provided are too specific. Whereas the "Communication" rubric uses language that can be applied in a wide range of classes, the rubric for "Personal and Social Responsibility," a skill that is the very bread and butter of a course like "Ethics and Society," is worded in such a way that only an anthropology class would fit most of its criteria directly. This is the very thing that I supposed a "Skills" approach would avoid. We need rubrics that address the skill, not the content.

My concerns as this relates to Humanities and Philosophy: The current model asks us to pick two skills from a list of "Critical Thinking," "Personal and Social Responsibility," and "Information and Digital Literacy." The "Critical Thinking" skill is a good fit, and the rubric there is fine. I am having difficulty with the others. First, I am unsure why our choices do not include "Communication," when our classes include some of the more advanced reading and writing students will find in college. Second, as stated above, the "Personal and Social Responsibility" rubric is written in such a way that our classes, which address that skill directly in terms of critically analyzing ideologies, worldviews, and the like, do not fit the rubric's anthropological slant. The application of that rubric to a "Logic" class, for example, would be a stretch.

Thank you for your time.

Type of feedback

  • General Feedback

October 6, 2017

Thank you for requesting my feedback. I teach Intro to Philosophy at Central New Mexico Community College.

My course can easily meet all of the outcomes listed for Humanities, as well as providing students with a chance to practice much-needed skills in the areas of Personal & Social Responsibility and Critical Thinking. I believe my course also helps students sharpen Communication and Information & Digital Literacy skills.

I have tailored my class to first help students evaluate and improve their critical thinking skills, then use those skills to analyze and criticize the major tenets of Western thought. Students must compare their own ideas to the philosophers' ideas, and consider how or whether those Western concepts have contributed to the societal and personal problems in their own lives. They also consider whether Western concepts can help solve those problems.

This process allows students to complete all three of the Humanities Content Knowledge requirements effectively. I believe a similar approach should be used in all Intro to Philosophy classes.

Also, I believe a college-level Intro to Philosophy class should be a graduation requirement for all high school and college students in New Mexico. We must examine and criticize Western thought to solve the problems this point of view has caused.

Type of feedback

  • Outcomes

Feedback is being provided for the following Outcome Document

Humanities

October 4, 2017

This is so vague and involved that it is impossible to apply it to what we do in general education chemistry classes. Science involves a tremendous amount of critical thinking, but this rubric sounds like a big research paper. Evaluating data in a lab report is also an effective way to develop critical thinking, so I'd like a rubric that could be used in this setting as well. See the proposed changes below.

Skill, Analysis: Connect pieces of information together in order to determine the intended meaning or application of the information.
Developing: Use various equations and models to solve problems and answer questions.

Skill, Inference: Understand and recognize what elements you will need in order to determine an accurate conclusion or hypothesis from the information you have.
Developing: Form a hypothesis or conclusion from data collected by the student or presented in a report by another person.

Skill, Evaluation: Evaluate the credibility of statements or descriptions in order to measure the validity of the information being presented.
Developing: Identify reasonable and unreasonable results from data and/or conclusions.

Type of feedback

  • Outcomes

Feedback is being provided for the following Outcome Document

Critical Thinking

October 4, 2017

We address this area in general education chemistry classes, especially sustainability. We describe the processes of producing electricity from fossil fuels and nuclear power and their impact on the environment. The outcomes include “collaboration skills” and “civic discourse”, but these skills are not the focus of the learning, and they are impossible to measure.

Here are some measurable outcomes we could apply to chemistry:
1. Describe the process of changing fossil fuel to electricity.
2. Describe the process of producing electricity from nuclear reactions.
3. Calculate the amount of carbon dioxide produced from burning a given quantity of fuel.
4. List pros and cons of various methods of producing electricity.

These would be appropriate for the "sustainability" section, but it is impossible to measure the "collaboration" and "civic discourse" sections.

Type of feedback

  • Outcomes

Feedback is being provided for the following Outcome Document

Personal and Social Responsibility

October 4, 2017

Essential skills: Quantitative Reasoning:
I would prefer the focus be on using mathematical equations in contextual problems, applying the correct equation for the given situation, and calculating correctly. The first skill addresses this, but the next 2 skills focus more on evaluating the validity of the equation. Most of the equations we use are not at all controversial, and if the data is valid and the calculations are correct, the results will be valid.

Proposed changes:

Analysis, Proficient: Students can choose the correct mathematical equation or model to analyze a specific situation or set of data.

Application, Proficient: Students can use mathematical equations and models to make predictions and conclusions. They can describe the limitations and assumptions for the particular models they use.

Type of feedback

  • Outcomes

Feedback is being provided for the following Outcome Document

Quantitative

October 4, 2017

Comments on Proposed General Ed science requirements:
In the big picture, I see general education as giving students tools. I think we should evaluate whether they know how to use these tools, not what they decide to build with them. I feel the proposed rubrics address what students are building with the tools instead of whether they can use the tools.
In chemistry class, we teach many new concepts and mathematical equations that describe matter and what is going on with matter in the world. Students leave the class with a new understanding of how the matter around them behaves. I want to assess their ability to use these concepts, not how they apply these concepts to formulate opinions about current politically charged scientific ideas.
I would like to use multiple choice exam questions as an option for assessment. These can be graded easily and objectively, they are manageable for groups of hundreds of students, and they are used in many professional settings.
This would be so much easier to apply if we had a short list of measurable outcomes to assess.

Scientific Literacy Proposed: Students describe the relevance of scientific concepts and processes required for personal decision making, participation in civic and cultural affairs, and economic productivity; students read, evaluate and can effectively analyze the validity of scientific arguments from the popular press. This is complicated, tainted by personal experience and bias, and impossible to evaluate. Educated people often disagree on which scientific arguments are the most valid, and I don’t want to be grading students on their opinions.

Change: Students can describe the scientific concepts presented in this class and how they relate to their experiences in the world. The specific concepts the students learn will vary depending on the class.

Scientific Reasoning Proposed: Students contrast scientific explanations for natural phenomena from other ways of knowing or arriving at conclusions and judgments; explain that scientific understanding is tentative and subject to falsification. This is also complicated and impossible to evaluate. Basic science classes should teach scientific explanations and not spend time on other ways of knowing. When students understand the scientific method, it will be obvious that scientific understanding can change as new experimental evidence is gained, but this does not need to be the focus of the discussion.

Change: Students can describe the scientific method, including that scientific understanding is based on hypothesis and experimentation, and that theories and laws are founded on substantial experimental evidence.

Experimental Techniques, Methods and Design Proposed: Students execute appropriate experimental designs; produce visual and tabular representations of scientific data; apply simple statistical descriptors to characterize experimental data sets. This works for me.

Type of feedback

  • Outcomes

Feedback is being provided for the following Outcome Document

Science

October 2, 2017

Essential Skill: Information and Digital Literacy
  • This rubric seems to be trying to combine skills covered in most Content Areas while removing all reference to content. It is largely about how to evaluate information and use it to do effective research.
  • Digital literacy, computer literacy… This looks like an afterthought shoved into row 2 of the Information Literacy rubric. It is too broad and ill-defined; it could mean anything. What do you actually want our AA/AS students to be able to do on a computer? One assessment suggestion is a “typing test.” I value typing as a skill, but I do not consider a typing test to be a reflection of digital literacy, and I would not be happy as a parent or business owner if that were the standard.
  • Based on the 3-out-of-4 component skills explanation in the rubric, would students be able to take courses for this Essential Skill that only covered rows 1, 3, and 4 and NEVER have to demonstrate proficiency in using computers or digital media? How would this be enforced?
  • As the parent of a soon-to-be college student, I consider Information Literacy a very important skill. Another very important skill is Digital Literacy. Although kids use lots of computers in K-12 nowadays, they do not understand how to properly use basic programs like Word and Excel to create college-level assignments. They do better with PowerPoint. They definitely do not understand how to be safe online or how to critically evaluate information found online and on social media.
  • I’d recommend separating Information Literacy from Digital Literacy. They are two separate ideas. They can be done together or they can be done separately. By pushing them together in this rubric, digital literacy gets short shrift, and courses that would focus primarily on Information Literacy are forced to teach and assess basic digital literacy.

Type of feedback

  • Outcomes

Feedback is being provided for the following Outcome Document

Information Literacy

October 2, 2017

Essential Skill: Quantitative Reasoning
• States that Gen Ed students should demonstrate Proficiency on ALL 3 components → the Proficiency components are BA/BS-level skills, NOT sophomore-level AA/AS skills, and definitely NOT freshman-level AA/AS skills.
• Freshman-level non-majors biology courses do not teach mathematics, nor do they have the time to do so if they are to cover their content areas. A freshman-level non-majors biology course could satisfy the Emerging level of the Essential Skill Quantitative Reasoning rubric and still teach its content, but the Developing and Proficiency levels would require that students have ALREADY demonstrated proficiency in their college-level math courses. That is not the reality: the vast majority of freshmen and sophomores are either taking their science and math at the same time OR taking their science BEFORE they take their math course.
o If science courses need to demonstrate the Quantitative Reasoning Essential Skill at the Proficiency level, then ALL freshman-level science courses will need to change their prerequisites to include college-level algebra. This would make it impossible for degree term-by-terms to satisfy accreditation time-to-completion requirements.

Type of feedback

  • Outcomes

Feedback is being provided for the following Outcome Document

Quantitative

October 2, 2017

Content Area: Lab Sciences
• For AA/AS degree, the Developing level of the rubric is reasonable for non-major’s biology courses.
• The separation of lab and lecture skills needs to be explained in the rubric introduction. The rubric does not explain how “lab” skills are different from science “lecture” materials. The majority of schools in NM now offer lecture and lab courses separately. Their content and grades are no longer tied together as a single 4 credit course. We do not use lab equipment in lecture courses. Will lecture courses need to demonstrate lab skills and will lab courses need to demonstrate scientific literacy skills?
• Scientific Literacy
o Redundancy → the Developing level duplicates what is in the Personal & Social Responsibility and Critical Thinking Essential Skills rubrics
• Experimental Techniques
o If the Quantitative Reasoning Essential Skill is included in 1000/2000-level Gen Ed science courses, it is in opposition to the level of quantitative reasoning expected at the Developing level of the Science rubric.

Type of feedback

  • Outcomes

Feedback is being provided for the following Outcome Document

Science

October 2, 2017

Summary of Critiques (detailed explanations follow):
• The 2 tier model of Content Areas & Essential Skills does not streamline curriculum but rather is a more complicated version of the same thing.
• No students are listed as members of the NM Gen Ed revision committee. Where are the student contributions to this process?
• How is this proposed model a new “from the ground up” model? Looking at the Content Areas and the credit hours, it looks essentially like the old model with several Essential Skills rubrics added on top.
• I do not understand how the additional 9 credit hours (3 courses) of Gen Ed will work. It states that these courses can come from outside of the Content Areas listed in the model but that each of the courses must be from a different content area and focus on two or more essential skills.
• Does this mean those courses will need to be listed on the Gen Ed matrix even if they don’t fall under one of the classic content areas? Would it matter if all 3 classes only cover the same two essential skills?
• Essential Skill Model
• complicates transfer courses and the demonstration of transfer student competencies
• is inconsistent in how it is structured regarding proficiency expectations and the difference between competency at the AA/AS level vs BA/BS level
• is redundant with all Content Areas – therefore, embed the essential skills
• is redundant for the Communications and Math Content Areas, as the Communication and Quantitative essential skills are already embedded
• removes choice from the Math and Science Content Areas
• The assignment of Essential “Skills considered to be closely associated with the content area” does not make sense.
• Content Area Lab Sciences:
• rubric needs to address separation of lab and lecture course components
• Quantitative Skills Essential Skills Proficiency level is not possible in a freshman/sophomore level science content course
• redundancy with Social/Personal Responsibility and Critical Thinking Essential skills rubrics – if redundant then embed
• Faculty and student workload is increased – the model would increase, for example, the assessable Science competencies from about 5 to a minimum of 10. This is in addition to the detailed course-level learning outcomes that are also assessed and that, with the new State course numbering system, are already largely standardized across NM institutions.
• Is it “Creative and Fine Arts” or “Creative and Performing Arts” Content Area? Will art history courses be subsumed under Humanities on the Gen Ed matrix? Should Content area “Humanities” really be “Humanities and Fine Arts” or “Humanities and Art Appreciation?”
• Based on the rubric, how is the Humanities Content Area distinctive from the Social/Behavioral Sciences Content Area?
• The NMSU website does not clarify whether the Communications Essential Skill and Content Area are now collapsed into a single document/rubric.
• Essential Skill Area: Information Literacy and Digital Literacy → I’d recommend separating Information Literacy from Digital Literacy. They are two separate ideas, and putting them together gives one short shrift and hamstrings the other.

My Recommendations:
1) Return to the old model and clean up the language to better reflect the embedded essential skills components, OR
2) Remove ALL Content Area rubrics, keep the model showing credit hours by Content Area, use only the Essential Skills rubrics, and ask each content area course to cover a certain number of Essential Skills, AND
a. With the common course numbering system, course learning outcomes must match 70-80% between schools for transferability. Therefore, if these courses have the same course number and are on the Gen Ed matrix, they are already covering enough of the same material, and it is redundant to ask for proof of outcome coverage in State assessment reports.
3) Establish a large-scale survey of NM faculty to rank where they expect their students to be on a rubric at the end of their Gen Ed course, AND
4) Establish a single person or small team as redactors who re-write the model and rubrics to have the same voice and structure and expectation levels for AA/AS Gen Ed and, perhaps, BA/BS Gen Ed expectations. Use the survey data to better refine what should be considered “proficient” at the AA/AS Gen Ed 1000/2000 course level.

Model
• The 2 tier model of Content Areas & Essential Skills does not streamline curriculum but rather is a more complicated version of the same thing.
• I compared the components in the current NMHED Area III: Laboratory Sciences, proposed Content Area: Science, and the proposed Essential Skills for Quantitative Reasoning, Critical Thinking, Communication, and Personal/Social Responsibility
o The current Area III Lab Science Competency covers the same proposed new content areas and hits on all the Essential Skill areas. It appears that the new proposed model separates the Content Area from some of the Essential Skills, but not all. Communication and Quantitative Reasoning are mostly removed from the proposed Content Area and put into a separate rubric. The Quantitative Reasoning Essential Skill rubric is exceedingly different from the current math expectations of science Gen Ed courses. Additionally, since Communication is completely removed as a skill from the Science Content Area rubric and as an expected Essential Skill, it appears to no longer be a relevant part of Gen Ed science, other than that students should be able to “read …arguments from the popular press.”
• The new model is essentially what we currently have, but it separates content from skills and makes both more difficult to track and evaluate.

• I do not understand how the additional 9 credit hours (3 courses) of Gen Ed will work. It states that these courses can come from outside of the Content Areas listed in the model but that each of the courses must be from a different content area and focus on two or more essential skills.
• Does this mean those courses will need to be listed on the Gen Ed matrix even if they don’t fall under one of the classic content areas? Would it matter if all 3 classes only cover the same two essential skills?

• Essential Skill Model complicates transfer courses and the demonstration of transfer student competencies
• I mapped out courses for examples of possible Liberal Arts degrees and it was very easy to make schedules where students didn’t have any courses that demonstrated one Essential Skill area. It was also quite easy to make a schedule where a student at College A would have Gen Ed courses all carefully crafted to hit the content areas and the essential areas and then transfer to College B where they would be missing Essential Skill areas because College B defines their Gen Ed courses differently.
• Problems this creates:
o Can this model demonstrate that all NM higher ed graduates have completed the components of their Gen Ed courses? If it cannot, what is the point of the model?
o The model still does not explain how transfer students with Essential Skill conflicts will be affected.
• Will they be allowed to just go on? Then why have this model?
• Will they have to re-take courses or take new courses to satisfy Gen Ed? If so, our new Gen Ed model defeats the entire point of having a common numbering system for the State and increases the cost and time involved for NM students to complete their degrees.
• Will students and their parents be able to understand how this affects them, their learning, and their costs? I have a son soon to enter college. I am faculty and have spent hours working through this model, and I can say that it does nothing to increase the quality of my son's education and may cause him to take extra classes because his 1000- and 2000-level courses may not transfer properly. I do not think many parents or students would have the tenacity to work through the model long enough to understand how it impacts them.

• Essential Skill Model is inconsistent in how it is structured regarding proficiency expectations and the difference between competency at the AA/AS level vs BA/BS level
• The majority of the courses on the Gen Ed matrix will be 1000-level freshman and some 2000-level sophomore courses. Essential Skill and Content Area expectations need to match that course level and not expect proficiency at the 4000-level of a BA/BS degree.
• If Gen Ed is primarily 1000 and 2000 level courses, why are some rubrics including mastery expectations for BA/BS degrees?
• If information is to be included for BA/BS mastery, could the rubrics all be consistent in usage and language? Could all rubrics define the Developing level as the expectation for the Gen Ed AA/AS level and the Proficiency level as the expectation for the BA/BS?

• Requires Rubric Proficiency for Gen Ed: Math, Quantitative Reasoning, Critical Thinking, Information/Digital Literacy, Social/Behavioral Science
• Requires Rubric Developing for Gen Ed: Science, Creative/Performing Arts, Humanities?, Communications, Personal/Social Responsibility
• Explains both AA and BA Developing and Proficiency levels: Science CA, Creative/Performing Arts, Communications, Personal/Social Responsibility

Content Area or Essential Skill | Required level of mastery
Content Area: Science | Meet Developing level of rubric for AA/AS; meet Proficiency level of rubric for BA/BS.
Content Area: Math | Meet proficiency levels for the appropriate rubric (college algebra, survey of math, or statistics).
Content Area: Communication | No rubric included on the website.
Content Area: Creative & Performing Arts | "any combination of two out of three of the component skills … skill level reached by the end of the course should be at least in the Developing …. Proficiency corresponds to the level anticipated for a Fine Arts major at graduation."
Content Area: Social/Behavioral Science | "Proficiency would be defined at a level appropriate to general education."
Content Area: Humanities | "Levels of … indicate a progression across increasing levels of coursework … Students…not expected to reach proficiency levels as stated in the rubric after an introductory course."
Essential Skill: Quantitative Reasoning | "demonstrate competency at the proficiency level for all three component skills"
Essential Skill: Critical Thinking | "a course must cover (to some extent) all four component skills…Student should be able to reach the proficiency level after two courses in critical thinking"
Essential Skill: Communication | "At the completion of … General Education curriculum … students should aim for, at minimum, the Developing level for each component skill… should reach the proficiency level by the end of a baccalaureate degree program"
Essential Skill: Information & Digital Literacy | "course …should encompass three of the four component skills. Proficiency … is defined at a level appropriate to general education."
Essential Skill: Personal & Social Responsibility | "course…needs to focus on at least two of the below component skills …student should be at the Developing level in all areas. An undergraduate …should reach the Proficiency level by the end of a baccalaureate degree program"

• The assignment of Essential Skills "considered to be closely associated with the content area" does not make sense. At a minimum, Communication, Information/Digital Literacy, Critical Thinking, and Personal/Social Responsibility should be covered to some extent in ALL the content areas. Quantitative Reasoning should be a potential Essential Skill for math, science, and social/behavioral science, although not at the proficiency level posted in the rubric.

• Essential Skill Model is Redundant with all Content Areas – Embed the essential skills
• Many of the essential skill components are already embedded in the Content Area components. This is redundant, needlessly complicates the model, and increases the time and cost required of faculty and school staff to document compliance. The solution is to embed the most relevant essential skill components in the Content Areas and get rid of the 2-tier documentation. If the idea of “Essential Skills” is really important to include in the model then highlighting or tagging these embedded skills would work as well. Embedding the essential skills would also solve the inherent problems the current model introduces into course transferability.

• Essential Skill Model is Redundant for Communications and Math Content Areas as the Communication and Quant essential skills are already embedded
• Logically, ALL programs will choose the Communication Essential Skill for the Communication Content Area and, similarly, the Mathematics Content Area will cover the Quantitative Reasoning Essential Skill. Therefore, all Content Areas EXCEPT Communications and Math are required to cover two Essential Skills. Math and Communications will only cover one Essential Skill different from their Content Area.

• Essential Skill Model removes choice from Math and Science Content Areas
• All content areas must choose two out of three Essential Skill areas, but the Mathematics and Science content areas do NOT get a choice. Those content areas MUST cover Quantitative Reasoning as one of their Essential Skill areas because Quantitative Reasoning is not listed as a potential skill for any other Content Area.
• Quantitative reasoning is not a good fit for introductory, 1000-level non-majors science courses. Most students either take their first math class at the same time as their first science course or put off their math course as long as possible. A science course CANNOT cover all of its content and also teach all the math required at the proficiency level of the Quantitative Reasoning Essential Skill rubric.

Content Area: Lab Sciences
• For the AA/AS degree, the Developing level of the rubric is reasonable for non-majors biology courses.
• The separation of lab and lecture skills needs to be explained in the rubric introduction. The rubric does not explain how "lab" skills differ from science "lecture" materials. The majority of schools in NM now offer lecture and lab courses separately; their content and grades are no longer tied together as a single 4-credit course. We do not use lab equipment in lecture courses. Will lecture courses need to demonstrate lab skills, and will lab courses need to demonstrate scientific literacy skills?
• Scientific Literacy
o Redundancy: the Developing level duplicates what is in the Personal & Social Responsibility and Critical Thinking Essential Skill rubrics.
• Experimental Techniques
o If the Quantitative Reasoning Essential Skill is included in 1000/2000-level Gen Ed science courses, it conflicts with the level of quantitative reasoning expected at the Developing level of the Science rubric.

Essential Skill: Quantitative Reasoning
• States that Gen Ed students should demonstrate Proficiency on ALL 3 components, but the Proficiency components are BA/BS-level skills, NOT sophomore-level AA/AS skills and definitely NOT freshman-level AA/AS skills.
• Freshman-level non-majors biology courses do not teach mathematics, nor do they have the time to do so if they are to cover their content areas. A freshman-level non-majors biology course could satisfy the Emerging level of the Quantitative Reasoning Essential Skill rubric and still teach its content, but the Developing and Proficiency levels would require that students have ALREADY demonstrated proficiency in their college-level math courses. That is not a reality, because the vast majority of freshmen and sophomores are either taking their science and math at the same time OR taking their science BEFORE their math course.
o If science courses need to demonstrate the Quantitative Reasoning Essential Skill at the Proficiency level, then ALL freshman-level science courses will need to change their prerequisites to include college-level algebra. This would make it impossible for degree term-by-term plans to satisfy accreditation time-to-completion requirements.

Essential Skill: Critical Thinking
• Explanation of the rubric refers to Proficiency mastery after two courses. How will this be tracked and assessed? Which courses get to do Developing mastery and which Proficiency mastery?

Essential Skill: Information and Digital Literacy
• This rubric seems to be trying to combine skills covered in most Content Areas while removing all reference to content. It is largely about how to evaluate information and use it to do effective research.
• Digital Literacy, computer literacy…. This looks like an afterthought shoved into row 2 of the Information Literacy rubric. It is too broad and ill-defined; it could mean anything. What do you actually want our AA/AS students to be able to do on a computer? One assessment suggestion is a "typing test." I value typing as a skill, but I do not consider a typing test a reflection of digital literacy, and I would not be happy as a parent or business owner if that were the standard.
• Based on the 3-out-of-4 component skills explanation in the rubric, could students take courses for this Essential Skill that cover only rows 1, 3, and 4 and NEVER have to demonstrate proficiency in using computers or digital media? How would this be enforced?
• As the parent of a soon-to-be college student, I consider Information Literacy a very important skill, and Digital Literacy another. Although kids use lots of computers in K-12 nowadays, they do not understand how to properly use basic programs like Word and Excel to create college-level assignments (they do better with PowerPoint). They definitely do not understand how to be safe online or how to critically evaluate information found online and on social media.
• I recommend separating Information Literacy from Digital Literacy. They are two separate ideas that can be taught together or separately. By pushing them together in this rubric, Digital Literacy gets short shrift, and courses that would focus primarily on Information Literacy are forced to teach and assess basic digital literacy.
Faculty and Student Workload is Increased
• Under the current Gen Ed model for Lab Sciences there are 5 competencies. Under the proposed new model there will be a minimum of 10 competencies. This is in addition to the detailed course-level learning outcomes that are also assessed and that, with the new State course numbering system, are already largely standardized across NM institutions.
o Currently my science courses utilize one to two assignments/exams to assess the 5 competencies.
o Under the new model I would need 3-4 assignments/exams to assess the Science Content Area and the Critical Thinking Essential Skill, and I would need to significantly revise the course-level learning outcomes, remove biology content, and add math content in order to teach and assess the Quantitative Reasoning Essential Skill at the proficiency level.

Type of feedback
• Models
• General Feedback