I've been a nurse for 15 years but a healthcare professional for over 30, and I do believe nursing is a profession. By definition, a profession is a group (or an individual within it) that establishes its own standards of practice and is self-governing. Nursing meets that definition. There have been changes in the field, both good and bad. The good changes include expanded scope of practice and greater autonomy. The bad changes involve the ever-increasing non-care obligations that pull nurses away from their primary responsibility, the patient: administrative duties, layered documentation, committees, meetings, and so on. These pressures affect all healthcare professionals, not just nurses, and are driven by forces outside the healthcare industry itself. Hopefully this helps.
Good luck,
R