Zen and the Art of Nonprofit Technology Archive

Archive of Zen and the Art of Nonprofit Technology

This is the archive of my old blog, zenofnptech.org, which hasn't been up in years. It's in reverse chronological order. Much of it is outdated, and not particularly useful, but I figured it would be good to have it up. -MP

Why Crowdfund this book?

On 02 May, 2013 By mpm

There are a number of ways that organizations in the nonprofit technology field fund writing projects. Some are internal to an organization, others are funded by grants from foundations, and some are funded by vendor sponsorships.

Each of these methods has its upsides and downsides. They provide a great way to get good, solid content into the hands of as many nonprofit stakeholders as possible, and that's a good thing. Having been involved in a number of writing projects of those sorts over the years, I know how much valuable content is possible when organizations, funders and/or vendors support writing.

The primary downside is that none of them are independent. The organization will have its particular focus and branding, depending on what its perspective and mission are. Funders have their own missions and motives for funding content. And vendor sponsorships generally place some limits on the writer's freedom to speak critically of any individual vendor, or of a group of vendors collectively, if a sponsor is among that group. For instance, you're not going to find deep, critical questioning of the relatively recent consolidation of nonprofit CRM/Fundraising software options in anything sponsored by a particular vendor that shall remain nameless here, but will be named in my book.

Crowdfunding this book frees me to ask big questions without fear. 

Of course, I have my own biases - I am not objective. No one is. But there is no one looking over my shoulder, or directing what I write (or don't write). And I hope that means that I'll be able to provide an independent view, and provide a helpful resource, unencumbered by who funds the writing.

But I need your help, the help of the crowd, to get this done. Please consider supporting me.

Book Excerpt #2: Very Drafty Piece of the Introduction

On 25 Apr, 2013 By mpm

I promised a few excerpts during the Indiegogo campaign, and here is another one - a part of the Introduction.

The idea for this book came one day, as I was pondering the fact that I had stopped the blog that gave this book its name. I stopped blogging because I felt that I'd said all that I wanted to say, and saying more was just useless repetition. And, of course, in blog form, it is. I then realized that probably the best way for me to talk about my wide range of issues, questions and ideas around nonprofit technology was to do what I have been doing in the fiction realm for some years... write a book about it.

So here is that book. If you are looking for a place to read about why you might want to pick Drupal over WordPress, or get advice about Salesforce vs. Raiser's Edge, this is not really the place. Of course, I have my opinions, and you'll hear a bit of them here, but there are many wonderful websites, organizations and people to help you with questions at that level of detail. There are lots of links and resources at the end of the book.

This book is about bigger things. For individuals and organizations there are best practices, planning resources and detailed info on technology decision-making from a mission-based perspective. For the sector, there are some moderate doses of philosophy, ethics and politics. I'll start out with a short history of modern technology and its influence on nonprofits, and the evolution of the field of nonprofit technology, and where the field is now. And because there are other voices that should be heard, I'll be bringing in some of them to speak about their perspectives and experiences.

I'll talk about philosophies of technology use in nonprofits, including increasing mission efficacy, improving efficiency, and increasing fundraising. There will be information about the technology factors outside of the sector, such as hardware, software, bandwidth and services. I'll discuss vendors and the sometimes sticky relationships between vendors and nonprofits, diving into questions of not only what makes a good vendor (and what makes a bad one) but also the ethics of the work we do. Finally, I'll dive into talking about open source software and software development more broadly, open data and open standards, and I'll wind up with issues of privacy and security. And there will be moments of Zen, when I ask questions like "why are we doing this, anyway?" and "what is the sound of one hand tweeting?"

This book is for anyone with an interest, whether it be passing or deeply rooted, in nonprofit organizations. You might be the brand new board member of a nonprofit organization, and you've been put on the technology committee. You might be an intern working in the communications department of a big nonprofit, hoping to turn that experience into a career in communications. You might be a long-time member of the nonprofit technology community, and remember what a "circuit rider" was. You might be a developer who has done work on occasion for nonprofit organizations. You might work in, or own, a company whose major market is the nonprofit sector. You could be a student of nonprofit management. You might work in a foundation, and do technology grantmaking. This book is for all of you, and anyone else who finds this topic of interest.

And then, you might ask, who am I to write such a book? I have been working in the nonprofit sector for most of my adult life, in some capacity or another. I was on my first nonprofit board in 1985, and have been in one major role or another in a nonprofit and/or activist organization almost every year since then. I have been involved in starting two nonprofit organizations, and have now been on a governing body of a total of eight, including two churches. I currently serve on two boards: a small Oakland-based food justice organization, and Aspiration, a nonprofit organization focused on nonprofit technology. I am a past board member of NTEN, the Nonprofit Technology Network. I have also spent the years from 1994 through the present working in a variety of capacities with nonprofit organizations specifically around technology. I have been the in-house (volunteer) techie as well as a consultant and developer for many organizations over those years. Although I wasn't (literally) in the room at the very beginning, I've watched the field of nonprofit technology grow from a hundred or so loosely organized people and a small handful of start-up organizations and companies, to the field that it is today, with thousands of people who consider themselves part of the field, and hundreds of companies and organizations devoted to serving the sector. I have also been other things: a scientist and academic, a seminarian and theologian, and a science fiction writer. And it is likely these other things that give me a bit of a unique view into the field of nonprofit technology.

This book will give you information that will be useful in your work in and with nonprofit organizations. It will give you concrete guidance to help you make good technology decisions, whatever they may be. It might also sometimes make you uncomfortable, because there are uncomfortable truths about some of what happens in this field, and about technology in general. I hope that prospect won't make you put the book down, but will instead encourage you to read on, face those things that might be uncomfortable, and start (or in many cases, continue) discussions and dialog about those issues.

This book leans much more heavily on anecdotes based on experience and philosophy than on evidence based on research, but having been a scientist, I'm quite clear when I need evidence to back up an assertion, and I should be taken to task if I don't pay attention to that. That said, every writer is biased, whether or not they tell you their bias up front. I will point mine out plainly. I am someone who deeply questions the way our society is structured, and thinks that the benefits of the great innovations of our time should be much more evenly spread. Most people in our society (I'm talking specifically about the United States) work too much and are paid too little. And, to boot, we're endangering our own long-term survival on a planetary scale.

And this perspective has shown itself in how I think not only about technology, but about how technology is, and should be, used within nonprofit organizations. I am also a realist and pragmatist. People who have worked with me in the open source/free software movement know this about me. If there is a tiny, scrappy nonprofit with a leader who finds it easier to get their mission accomplished with a proprietary tool, then I'm not one to say that leader should struggle with an open source tool that isn't up to the task. But I am one to take the entire field to task for not providing better options for that tiny, scrappy nonprofit.

I was born and have lived my entire life inside the United States of America. I've been to many other countries to visit, and I have worked some with international organizations, but I can't in any way say that I have any international expertise. Although this book will certainly have relevance for people who work in NGOs outside of the US, the focus of this book is very definitely the US. I will talk in some detail about some specific international issues dealing with technology, but this book is very much grounded in my experience within the US, and I won't make any claims about the generalizability of these issues outside of the US. I think most of this is relevant to Canadians, and to some extent Europeans, but even in those places, the landscape of nonprofit/NGO technology is quite different, as I've learned.

I've written this book with three basic resources: my own brain and experience, the brains and experience of a few others in the field, and the resources available on the web and in some books (all of which are footnoted in this book.)

The nonprofit technology field is at an interesting moment. The Internet (herein, "the Net") has matured to such an extent that it is ubiquitous, and in our very hands at almost any moment. There are organizations ranging from single-person advocacy or activist outfits to multi-billion-dollar nonprofits. And serving those organizations in various capacities is everyone from legions of individual "accidental techies" to Fortune 100 publicly traded companies. Observing the field, and technology more broadly, at this particular moment, and then looking forward, seems especially satisfying to me. I hope it will be to you as well.

Continue Reading

Book Excerpt #1

On 18 Apr, 2013 By mpm

I'll be posting a series of short excerpts of what I've already written of Zen and the Art of Nonprofit Technology. I hope this will whet your appetite for more. Please consider supporting me to get this book written.

This is a part of the chapter "Strategic Planning for Software and Internet Projects". The section is entitled "CRM and the Myth of Uniqueness."

One of the hardest problems for nonprofits to solve is finding a good CRM (Constituent Relationship Management) system that really works for them and fits their needs. There are a ton of options, and many of them are sub-optimal. Many are exorbitantly expensive. If I were to add another item to my list of things that keep both individual nonprofits and the sector as a whole from really solving the CRM/Fundraising/Advocacy conundrum, it would be what I call the "myth of uniqueness."

Of course, each nonprofit is unique - unique in its relationship to its mission, the personalities of the staff and leadership, where it's located, its quirks and dysfunctions... But that uniqueness should really not be reflected in the software it uses.

For most organizations, the perception of uniqueness around their data ecosystem comes about because way back in 1995, someone (perhaps the ED, perhaps the CFO) got really tired of dealing with paper, and of having to generate painstaking reports manually. It would take them days and days. They were fed up. They hired a database guru (I used to be one of these back then), or they drafted an accidental techie, or they found out that their Board Chair's high school kid knew their way around Microsoft Access, Visual FoxPro, or FileMaker.

And they built their first database. And it wasn't pretty or perfect, but at least the ED and CFO could get five reports quickly and easily. They weren't really the reports they optimally needed, but that was what the system generated, and it worked. And the staff stopped grumbling about the fact that the tool would lose data in the middle of data entry and only allowed fields of a certain size; instead, their workflow and systems were modified to make it easier to avoid those problems and the limitations of the database. And the whole organization, over time, wrapped its workflow around this imperfect tool.

Five, seven, eight, or perhaps even ten years later, they realize that they are totally outgrowing this tool, and they need a new one. And so they write up the requirements for the new tool. For instance: it has to generate the same five reports that everyone forgot were imperfect. The tool has to fit into their workflow... the workflow they forgot they built around the unique, imperfect tool that they built first.

We are all unique human beings, but our organs, like our hearts and stomachs, look and work pretty much the same. (And no, in this metaphor, the data ecosystem of an org is not its brain - its brain is the people.) Good tools work like organs. I have personal experience with one tool that can work very well for a very small, scrappy organization, as well as for a multi-million-dollar organization with a huge, huge donor/volunteer base. I would say there are good odds that this tool will work for a lot of organizations in the middle. I can say from personal consulting experience that a different tool can work for a Fortune 100 financial services corporation with thousands of users, gazillions of deals, and catrillions of dollars in size, as well as it does for a small educational nonprofit with one user and tens of donors and hundreds of students to track. Good tools can do that.

I'm not going to spend a lot of time going through deep details of how to choose a CRM, or a system to manage your data ecosystem. But I will give you some basic ideas on how to start planning for a change in that data system, and what basic factors to look for in making those choices, and what kinds of effects this change may have on your organization.

First, let's ask my favorite question: "Why are you changing your CRM?"

There are a number of reasons why organizations choose to change the tools they use for their data ecosystem:

  1. The tool is not accomplishing the tasks the organization needs it to accomplish
  2. The tool is deprecated by the software developer
  3. The tool is not cost-effective
  4. You need to integrate with other organizational tools, and your current CRM does not support that integration.
  5. There is a new development director or executive director.

I'll look at each of these. First off the bat - that last reason is the planet's worst reason to change a CRM, even though it happens all the time. It's totally understandable: a new person who is responsible for a huge part of the organization's success is used to a specific tool, and wants to use the tool they know and feel comfortable with (and perhaps even feel is the bee's knees). And, of course, going with my idea that organizations are not all that unique, it shouldn't be a big deal, right?

The problem is threefold. First, the tool the new person is comfortable with may not really be the best tool for the organization. It may be, but it will take time, analysis and planning to figure that out. Second, it takes time and especially money to do a big data ecosystem change - time and money that may well be better spent on something else. Third, it forces change on end-users who have gotten used to whatever system is in place.

This is, of course, not to say that a new CFO, or ED, or Development Director never comes in, realizes that the data system is a mess, and rightly pushes for change. That happens as well. But it's important to separate the needs of the organization's data ecosystem from the needs of a new executive.

If the tool is not accomplishing the tasks the organization needs, what's important is to look very specifically at the points of pain, and get some advice about how to deal with them. Sometimes it may not require a wholesale change of the CRM or donation management tool; it may instead require a new plug-in or module, or spinning off a particular part of the functionality to a new tool, without having to change the whole thing.

If a software manufacturer has decided to deprecate a tool that has become mission critical, your choices are more limited. With the increasing consolidation of enterprise-level CRM and donation management tools, this has become a big issue of late.

There is more in that section, and chapter, and of course, the whole book (when I finish writing it)!

Continue Reading

NTAP Report from NTEN

On 28 Mar, 2013 By mpm

There hasn't been much fanfare about this, but NTEN recently released a report called "Nonprofit Technology Assistance Providers Sector Reach." You can read it here.

It is definitely an interesting study. The problem is that the open source focused and vendor-neutral NTAPs, such as Aspiration, the Progressive Technology Project, and others, are glaringly missing. The only vendor-neutral NTAP included was Idealware.

Research like this is important to our sector, but to so blatantly leave out such important NTAP players leaves the impression that this was by design. These are active, vibrant organizations that have a great influence on the progress of technology in nonprofits, and to leave them out of a study like this means that the study isn't authoritative, and doesn't actually measure the true reach of NTAPs in the sector.

FYI, the study was funded by Microsoft. Things that make you go hmmm.....

Continue Reading

More about that book

On 08 Apr, 2012 By mpm With 5 Comments

As I said a while back, I'm writing a book about nonprofit technology. It will be titled (this is no surprise) "Zen and the Art of Nonprofit Technology". Having been at the #12NTC (Nonprofit Technology Conference) really got me excited about the book. I met people who I would like to talk to about the book, and I got some good ideas about what I might want to delve into. I'm 10,000 words into the book already, although I have a lot of research to do. I'll be filling you all in on more details as it develops, but below is a tentative outline (lots of things haven't been fleshed out yet). Comments and questions are welcome!

  • Acknowledgements
  • Introduction
  • Background and Philosophy of Nonprofit Technology
    • Definitions
    • Brief History of Nptech and current status of the field
    • Should Nonprofits be like businesses?
    • What is technology for, anyway?
    • The Raw Materials
      • Hardware
      • Software
      • Bandwidth
      • Services
    • Technology for change?
  • Hardware, Software, and Cloud in the nonprofit sector
    • From servers to phones
    • From word processors to cloud CRM
    • Social Networks
  • Strategic Planning
    • Planning for infrastructure
    • Planning for software/internet projects
    • Planning for communications and social networks
    • "Best Practices"
      • How to define them, and where to find them
      • Sector-wide and organizational best practices
  • Vendor/Nonprofit Relationships
    • Technology needs of nonprofit organizations
    • Types of support systems
      • Internal
      • Individual Consultants
      • Cooperatives
      • For Profit Companies
      • Publicly Traded for-profit companies
    • Corporate Philanthropy in Nonprofit technology
    • Philosophy and Ethics of vendor/nonprofit relationships
      • The Ethics of Client/Consultant relationships
      • The tyranny of the hourly rate
    • Nonprofits that support other Nonprofits
      • The NTAP
      • Other kinds of nonprofit technology focused nonprofits
      • The role of foundations
  • Open Source Software in the nonprofit sector
    • Status of the field: Winners and Losers
    • Making decisions around open source software
    • Looking forward
  • OpenAPIs/Open Data/Open Social
  • Are nonprofit data standards an oxymoron?
  • Software Development and the Nonprofit sector
  • Privacy and Security in the networked age
    • Why privacy matters for nonprofits
    • The Identity conversation
    • Security and the nonprofit organization
    • Why net neutrality matters
  • Conclusion

Continue Reading

Example Projects and Portfolio

On 07 Apr, 2012 By mpm

Here are some example projects of my work over the years. (Roles when in a team: PM=Project Management, SA=Systems Administration, IA=Information Architecture, DD=Drupal Development, DB=Database Management, Dev=Code development)

2013 (Mostly with DevCollaborative):

2012 (Mostly with DevCollaborative):

2011:

2009-2010:

Recently (in the last 4 years):

  • Implemented new site for Zen Hospice Center (everything except design/theming)
  • Migrated West Suburban Teen Clinic to Drupal (everything except design/theming)
  • Strategic planning for Revenue Watch Institute
  • Strategic planning for Center for Reproductive Rights
  • Strategic planning for other Reproductive Rights organizations
  • Researched original Idealware report on open source CMS
  • Wrote and updated NOSI primer on open source software

Other stuff:

Continue Reading

Zen and the Art of Getting Your Website Done

On 07 Apr, 2012 By mpm

I've had my sleeves rolled up since the mid-'90s building websites and web-based databases. It's in my blood. I've used most back-end web technologies invented at least once, and I've dived deeply into a number of them over the years. Right now, my focus is on Drupal websites, and on setting up and administering the LAMP/R stack.

I work with the DevCollaborative, and also on my own. I also do small-scale projects for Buddhist organizations on a dana basis.

I have a list of example projects and the like.

Continue Reading


Interesting new twist on Android/iPhone divide

On 07 Apr, 2012 By mpm


I'm not at all an Instagram user. Since I am much more language-driven than image-driven, it's just not something I've used. But I came across this article about the really weird response of iPhone Instagram users to the release of an Android version.

As an Android user, I've often thought of the Android/iPhone divide as one that was more about open vs. closed, and about choice. Android is a sort-of open source mobile operating system, there are many fewer restrictions on the ways app developers can develop for the platform, and there is a vibrant hacker community that Google doesn't bother to police (in fact, the CyanogenMod codebase is hosted on Google Code!). Apple has been trying to say that jailbreaking your phone is illegal (but the court begs to differ).

And, of course, there are dozens of phones with a wide variety of features that run Android, and only one iPhone.

It appears, based on the vocal outcry on Twitter (well, I don't know how scientific a sample that is), that iPhone users think of the Android/iPhone divide as a class divide (and, in some cases, a race divide). I wonder how widespread this perception is. It's also, frankly, ludicrous, since the majority of newly released Android phones cost as much as iPhones - out of the reach of a lot of people in this country (let alone the world) - and you can get both older iPhone models and low-end Android models for free from some carriers.

Continue Reading

A Book!

On 07 Mar, 2012 By mpm

As I said, I am done. Blogging, that is. But I'm a writer, and the writer in me decided that "Zen and the Art of Nonprofit Technology" is going to be a book. I'll leave you just with that little tidbit. In April (after my Lenten social media fast), and after I relaunch my personal site (I'm moving all of the tech stuff there, and this URL will have stuff for the book), I'll fill everyone in with a lot more details. I'm actually really excited about this project, and have begun to line up folks for interviews, do research, and all sorts of stuff.

Continue Reading

The. End. (for now)

On 25 Jul, 2011 By mpm With 4 Comments

I've been thinking about the purpose of this blog in my life for the last few months. I started blogging specifically on technology just over 6 years ago, took about a year's hiatus in 2005-2006, and have been writing consistently here ever since. But the time has come for me to stop. Mostly, it's because I've run out of things to say. On one hand, the technology issues I cover are well covered elsewhere. There are some amazingly good blogs out there focused on the use of Drupal and other open source tools. You don't need to hear from me about the newest web tools - you have ReadWriteWeb and Mashable for that, among others. On strictly NPTech topics, I can only say "nonprofits should use open source software for better sustainability," "there's more to talk about than social media," "all nonprofit software should have open APIs," "technology won't save the world," "the nptech world should develop open standards," and "nonprofits should collaboratively develop software" so many times. I know that this isn't falling totally on deaf ears, but some days it does feel that way. And I'm kinda tired and bored of sounding like a broken record, so I will stop rotating now.

And besides, the landscape has changed somewhat - in some ways better, in some ways worse. I'll still be building websites (and their successors) for the foreseeable future with Drupal, and perhaps with whichever cool, new open source development framework comes next after Drupal becomes irrelevant (it will, eventually). And I'll be Google+ing (rather than Tweeting, which is mostly for my writing, or Facebooking, which is friends/family) interesting Tech and NPTech topics as they come along and are discussed. And when Google+ stops being relevant, I'll find the next thing that comes along to share links and ideas and discuss. But for now, and until I change my mind (I like to keep my options open), this blog will be inactive.

Was this blog a success? I don't know how to answer. Perhaps you can tell me in the comments. For a good while, I had a lot of fun doing it. I hope I was at least a little helpful. Those are enough for me. For the curious (well, OK, it was mostly me who was curious): there are 409 posts and 922 comments. Since September 2007, when I started to use analytics, there have been 151,000-ish unique page views and 106,000-ish unique visitors. The most popular pages are (these are fascinating!):

  1. The home page
  2. LibreOffice vs. OpenOffice.org
  3. CRM and CMS Integration: Blackbaud Raiser's Edge and NetCommunity
  4. WordPress vs. Drupal... Fight!
  5. What is Cloud Computing?

Continue Reading


Interesting sites I'm looking at (weekly)

On 16 Jul, 2011 By mpm

Posted from Diigo. The rest of my favorite links are here.

Continue Reading


Interesting sites I'm looking at (weekly)

On 09 Jul, 2011 By mpm

Posted from Diigo. The rest of my favorite links are here.

Continue Reading


Interesting sites I'm looking at (weekly)

On 18 Jun, 2011 By mpm

Posted from Diigo. The rest of my favorite links are here.

Continue Reading


Interesting sites I'm looking at (weekly)

On 11 Jun, 2011 By mpm

Posted from Diigo. The rest of my favorite links are here.

Continue Reading


Interesting sites I'm looking at (weekly)

On 04 Jun, 2011 By mpm

Posted from Diigo. The rest of my favorite links are here.

Continue Reading


Why all (major) operating systems suck

On 02 Jun, 2011 By mpm With 9 Comments

I've been a user of a ton of operating systems over time. In the past ten years, I have been an everyday user of the big three, Windows, Mac OS, and Linux, for long stretches of time. I switched from Apple to Windows/Linux last year, and I've largely been OK with it, but I've complained enough about all three that I realized that they all suck. Of course, they suck for completely different reasons, which is part of the frustration. And each has places where it shines. Why can't there be a nice combination of all three? That would be perfect.

Why Mac OS X sucks:

  • Apple is becoming a controlling, closed system. With the advent of the Apple App Store, developers have to go through an approval process to get their apps on the store, there are specific things you can't include in an app on the store, and there will come a time when most people get their software through the store - so there will be less and less incentive to maintain non-app-store versions of software.
  • These days, you can find most kinds of software for the Mac, but there still is a relative paucity of apps in comparison to Windows.

Why Windows sucks:

  • Viruses, Trojans and Worms, Oh My!
  • Although I have only seen the Blue Screen of Death once in my year of Windows 7 use, there are still inexplicable slow-downs, crashes, and weird problems. And it takes FOREVER to boot, even with Soluto.
  • Internet Explorer

Why Linux (in my case Ubuntu) sucks:

  • I have to go through arcane (and, luckily for me, fairly painless) procedures to get simple things to work (like plugging a headset with a mic into my jack!)
  • Hardware manufacturers ignore Linux for the most part
  • Most software developers don't make Linux versions

The only good news I can see is that the operating system is getting less and less relevant. And, on balance, for me, Linux is winning. Now that Dropbox and Scrivener work on Linux, and I'm moving from Quicken to some online cloudish thing (suggestions?), I can pretty much leave Windows behind. (Oh, there is still Netflix. Sigh.)

Continue Reading


Interesting sites I'm looking at (weekly)

On 28 May, 2011 By mpm

Posted from Diigo. The rest of my favorite links are here.

Continue Reading

Real Social CRM

On 24 May, 2011 By mpm With 3 Comments

So I do have social media ennui, but I am also somewhat of a data geek, and cool ways of moving social media data into one's nonprofit data workflow are pretty important, in my most humble opinion. This post on Social CRM is not going to contain one buzz phrase. It's going to talk about one particular, interesting example of how to move social media data into a real live CRM - one you might even be using now - Salesforce.

This example uses an app from the Salesforce AppExchange called "Salesforce for Facebook and Twitter." To make things just a tad confusing, this is also called "Salesforce for Social Media" and "Salesforce for Twitter." There are likely many more options, but this is one I've seen that is pretty cool, although it has its weak spots. It definitely is geared more toward the "Service Cloud" than the "Sales Cloud."

You can set up multiple Twitter and Facebook accounts, and each Facebook account can have access to multiple pages. It's all done via OAuth, which is cool. Once you set up the accounts, you can then grab conversations. You can filter and sort, just like records in any other Salesforce object. You can choose whether or not to send Twitter or Facebook identities to Leads, Contacts, or Person Accounts. You can choose to create cases from tweets or Facebook posts as well. You can tweet or post to Facebook directly from Salesforce, and it works. You can schedule tweets and Facebook posts as well.

There is a lot more you can do - it's a pretty cool tool. The one thing I can't seem to find - and I don't know whether this is in development, or they won't ever do it - is a way to import your social graph into Salesforce: your Facebook fans or your Twitter followers. I'm not sure why this is, exactly. It seems a big gap to me, since it is the folks who engage with you whom you definitely want to make sure to keep track of. Anyway, if you are a user of Salesforce, the Nonprofit Starter Pack, or Convio Common Ground, this is definitely a tool to know about.


Interesting sites I'm looking at (weekly)

On 21 May, 2011 By mpm

Posted from Diigo. The rest of my favorite links are here.


Social Media ennui

On 17 May, 2011 By mpm With 4 Comments

I have a confession to make. I have social media ennui. I'm tired of reading and hearing about social media and nonprofits, and I'm annoyed that social media is taking up so much of the air space in the #nptech world.

As you know, I'm a bit of a technology curmudgeon, but I'm far from a luddite - I'm an early adopter, for the most part. I'm a fairly active user of Facebook, LinkedIn, Twitter, and some other social networking sites, and have been for years now. I certainly have followed and friended lots of organizations on these networks (particularly on Twitter, but also some more personally relevant to me on Facebook.) The apps I use most on my phone include the Facebook app for Android and TweetDeck. I spend some amount of my Drupal and WordPress development time, both for my clients and for myself, setting up one- or two-way integrations between websites and social media sites. I understand how the varied APIs work, and I have to keep on top of whether I should be using a "like" or a "share" button for Facebook. I've been using social media to actively promote my new science fiction books. In other words, I don't avoid social media. I use it a lot, and I actively facilitate my clients' use of social media integration with their web presence. (And I use hashtags in blog entries!) But I'm still bored silly.

Case in point: a new report out from IBM on Social CRM. It's geared toward a for-profit audience, but it certainly has some reasonably useful lessons for nonprofits, and it has been a topic of conversation in the #nptech world today. But there isn't anything in this report I haven't read a dozen times already. It doesn't help organizations bridge the huge data and workflow gap between their traditional CRM/donation management systems and their social media interactions. And if I hear the buzz phrase "game changer" one more time, I'm going to puke. It's hype designed to sell things, and hype designed to sell things isn't necessarily going to help make the world a better place.

No one should take this post personally. I'm very glad that most of my #socialmedia #nptech colleagues talk a lot about the ROI of social media, and really try to figure out what works and what doesn't. But we've had, what, three or four solid years of nonprofits using this stuff. Can it be demoted now?

So what do I want us to talk more about? How about lowering the costs of software by using open source and collaboratively developing software? How about data standards to help us share information more easily? How about finishing the work we did on getting the expensive CRM vendors to really open up their APIs, so that organizations can better integrate their systems? Maybe talking about how to deal with neglected nonprofit verticals like client management? Helping accidental techies get the training they need so that they can do more work in-house? Nonprofits who need tech help partnering with local organizations that provide training to the unemployed and ex-offenders? The list goes on and on.


My Tools: Writing

On 11 May, 2011 By mpm

I'm mostly doing this last post on my tools to pimp Scrivener. I was a loyal Scrivener user on my Mac for years, and when I moved to Windows last year, I mourned my loss terribly. But then! Then someone started to work on Scrivener for Windows and Linux. Almost enough to make a grown woman cry. I do just about all of my novel writing in Scrivener. It's great for outlining, for research, for writing scenes, etc. And it has a great compile function, to spit it all out into a manuscript when it's ready to edit. I have probably only used 30% of its features, but I love it, and look forward to using it. (Am I really looking forward to using it, or just looking forward to writing...?) I use LibreOffice for most other writing and editing tasks, although sometimes I must sadly use MS Word for some things (some ebook converters have a harder time with LibreOffice files, even when they are saved as .doc.) I've been experimenting with Scribus for page layout. I use GIMP for any graphics manipulation I need for cover art and such. And, of course, I do a lot of writing in WordPress and Drupal.


Interesting sites I'm looking at (weekly)

On 07 May, 2011 By mpm

Posted from Diigo. The rest of my favorite links are here.


Tools I use: Personal Web Presence

On 06 May, 2011 By mpm

I've had a web presence of some sort since way back when most personal URLs looked something like http://somecollege.edu/~username. In 2002 or so, I ditched hand-coded HTML for a series of CMS systems for my personal stuff. I started out using the CMS I wrote in Perl, called XINA. (Those were the days.) Anyway, that was then, and this is now. Here's what I use. Software:

  • WordPress - you already know it and love it. I use it for this blog only. I used to have two blogs on WP - this blog and my personal blog - but I moved my personal (and author) blog to Drupal, to integrate it with other stuff I had online.
  • Drupal - I use Drupal for my personal blog and also other purposes, like the website for my intentional community. My main personal site will be migrated to Drupal 7 soonish. My main sci-fi author site is already on Drupal.
  • Dokuwiki - my woefully neglected and out-of-date technology wiki is on Dokuwiki. Dokuwiki is a very cool tool. It's a wiki, but everything is stored in files instead of a database. That makes it quicker, and also much easier to migrate. The annoying part is that it is one more wiki markup to learn. (I wish SOMEONE would finally agree to make a wiki markup standard!!)
  • In the relatively rare case where I need to use HTML/CSS for web pages (there are a few legacy sites I maintain for friends) I use Bluefish (on Ubuntu.)
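To illustrate why Dokuwiki's file-based storage makes migration so painless, here is a minimal, hypothetical Python sketch of the idea - one page per plain text file, with the directory listing serving as the index. This is not Dokuwiki's actual code, and the class and file names are invented for illustration.

```python
# A toy file-backed wiki store: every page is just a text file, so
# "migrating" the wiki is little more than copying a directory. All
# names here are made up; this is not Dokuwiki's real implementation.
from pathlib import Path

class FileWiki:
    def __init__(self, root):
        self.root = Path(root)
        self.root.mkdir(parents=True, exist_ok=True)

    def save(self, name, text):
        # One page == one file; no database schema to dump or restore.
        (self.root / f"{name}.txt").write_text(text, encoding="utf-8")

    def load(self, name):
        return (self.root / f"{name}.txt").read_text(encoding="utf-8")

    def pages(self):
        # The "index" is just a directory listing.
        return sorted(p.stem for p in self.root.glob("*.txt"))
```

With a store like this, backing up or moving the wiki to a new host is an rsync or cp -r of the root directory, which is exactly the appeal.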

What I like most about WordPress is that I don't really have to do any work to use it or tweak it. I love how easy it is to use. I love Drupal for its flexibility - for my personal stuff, it's really great to be able to mix and match things (I actually have two different blogs on that site, but it's really only one blog... Drupal is ace at that sort of thing.) I keep debating whether or not to migrate this blog to Drupal. Stay tuned.

Hosting: All of my personal stuff is on Dreamhost. I say this with some hesitation. I have hosted with Dreamhost since 2007. They are worker-owned, pretty green, and their newsletters are quite humorous. They give free accounts to nonprofits. Their service has improved over the years, but they ultimately aren't all that reliable: they have downtimes (a really bad one recently), Drupal often barfs on Dreamhost during admin tasks, and you can't run Rails apps reliably at all. I'm going to spend the spring and summer migrating all of my domains (there are plenty!) to a VPS on Linode. (This will be my chance to play with IPv6, too. I already use Linode as a development server.)


Interesting sites I'm looking at (weekly)

On 30 Apr, 2011 By mpm

Posted from Diigo. The rest of my favorite links are here.


My Tools: Development

On 25 Apr, 2011 By mpm

Since I am a web developer, the core of my development workflow is, for sure, a browser. But not just one browser, or any browser - several. Chrome has become my everyday browser, although Firefox is making its way back into my heart now that Firefox 4 is so lean and zippy. But I am very often in both. I use Opera on occasion, and, of course, I use IE only when I absolutely have to (which generally means rebooting into Windows, something I do less and less these days.)

My other core tool is a console window. In Linux, I use the generic version. For Windows, I use SecureCRT, which is well worth the $, since PuTTY is not up to the task. (I know, it's open source, which is great. But it just doesn't cut it if you need to use it pretty much all day, every day, with multiple servers.) My text editor of choice is Emacs. Yes, Emacs. For Windows, I love Notepad++, a sweet open source text editor. I like Eclipse as an IDE - it's awesome. I think it's better than the proprietary Komodo, but that's just me; I'm sure people will beg to differ.

Other core tools are git for version control and GitHub for code sharing. I haven't found a GUI git client I like, so I just use the command line. IRC and Pastebin rock my world for getting help in troubleshooting problems, and IRC is great just for chilling with other developers.


Open Source vs. Proprietary: Web Server Software

On 25 Apr, 2011 By mpm With 1 Comment

By Web Server Software, I mean the software used to serve websites and pages. This includes the databases, operating systems, and other software involved in that process.

On the proprietary side, there are two options: proprietary UNIX, and Microsoft Windows with its associated Microsoft software. The current version of MS Server in use is Server 2008. Microsoft's web server software is called IIS, and its database server product is MS SQL Server, which people use for far more than just serving website data. The primary web development framework used in this environment is .NET. Proprietary UNIX has dwindled greatly in popularity with the rise of Linux. On top of proprietary UNIX, people generally run the associated open source server software for web, database, and development frameworks.

On the open source side, Linux is by far the most popular, with BSD in second place. Both Linux and BSD come in several flavors (or distributions.) Apache is by far the most popular web server software. MySQL and PostgreSQL are the open source database systems most in use on web servers, with PostgreSQL a pretty distant second to MySQL. Other database systems (such as NoSQL variants) are increasing in popularity, but are pretty far down from MySQL as well. Also, it is possible to run Apache and most varieties of open source databases and web frameworks on Windows, and that is not uncommon.

It's hard to know what the market share of server operating systems is, because there are different ways to measure it. You can measure how many units are sold. By that measure, Windows is first at about 49-67%, Linux is second at 16-23%, and proprietary UNIX is third at 7-22%. That underestimates things like self-installed OS systems (standard with Linux), as well as VPS systems. If you measure by surveying publicly accessible websites, you get Linux first at 41-74%, Windows second at 20-42%, and proprietary UNIX third at 2-5%. This underestimates servers inside enterprises. (Source: Wikipedia.)

From my perspective, the underestimation of self-installed and VPS systems by the first measure far outweighs the underestimation of enterprise servers, because plenty of organizations and enterprises also install Linux behind the firewall. It makes sense to me that the true number is much closer to the estimate from publicly accessible websites than to the unit-sales estimate. So on the OS side, Linux does look like it wins. Apache is far and away the most popular web server software, and it is way ahead of IIS. The most recent data from Netcraft shows that Apache has 63% of web servers, compared to 19% for IIS. Also, Apache is showing a clear upward trend, and IIS a clear downward trend.


Tools I use: basic workflow

On 12 Apr, 2011 By mpm

I was perusing Social Source Commons (something I don't do nearly often enough) and catching up on the SSC blog, and I thought it might be worth sharing with this audience what tools I use for my basic consulting workflow. I'll do another few posts for other areas, like development, system maintenance, personal web presence, and writing. (If you want to look at my Social Source Commons toolbox, it's here. It's not so up to date, and it's more a list of tools I have used, though some I still use.)

The center of my workflow, as for most consultants, is email. I've used a variety of email clients over time, and I have recently decided to ditch them and use Gmail exclusively. I've definitely noticed that I've been migrating a lot of the things I do to web-based apps of one type or another, and this is one example of that. I use Canned Responses to provide HTML signatures when needed, and I also forward all of my mail to Gmail, then send out mail as other identities. (I've learned how to circumvent that annoying "Sent on behalf of" thing in Gmail - use the SMTP server of the email alias you're using.)

What's also very close to the center is my project management tool, Redmine. (I'm actually now using a very recent fork of Redmine, called ChiliProject.) I've waxed on about this tool ever since I found it, and I would love to challenge a loyal Basecamp user to a point-by-point comparison of the two tools. I think it knocks Basecamp right out of the water. Its core is a very powerful and flexible ticket tracker, but it includes all of the important project management features you want and need - milestones, time tracking, wikis, a file repository, even discussion boards - and it connects with version control repositories. It works for multiple projects. And it's open source, and isn't even that hard to get set up and running.

Another important tool, which I use in my personal life as well as my consulting life, is Evernote. Evernote rocks my world. The web interface is great, as is the desktop application (which I use cross-platform - the Windows version works great with WINE). I also access Evernote on my Android phone. It's a great tool. I use it for to-do lists, things like blogging calendars, and the Chrome Evernote extension allows for clipping of whole web pages, which I love (there is a Firefox extension as well.)

A tool I've recently come to adore is Passpack, an awesome web-based password management tool for teams. I love the collaboration features. For sharing files, as well as providing solid file backup, I use Dropbox (it even works on Linux!) And, like all consultants, my workflow involves documents and spreadsheets, and for that I mostly use LibreOffice, although sometimes Google Docs makes sense for collaboration. I use Google Reader for RSS feeds, and TweetDeck or, more recently, HootSuite for Twitter. (I really like the tabbed interface of HootSuite. It makes looking at the variety of lists I have a lot easier.)
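The "use the SMTP server of the email alias" trick mentioned above can be sketched in a few lines of Python. This is a hypothetical illustration, not Gmail's mechanism: the host, addresses, and credentials below are placeholders you would replace with your alias domain's real settings.

```python
# Hypothetical sketch of sending mail as an alias identity by talking to
# the alias domain's own SMTP server (rather than relaying through Gmail,
# which is what produces the "Sent on behalf of" header). All hosts,
# addresses, and credentials here are made-up placeholder values.
import smtplib
from email.message import EmailMessage

def build_alias_message(body):
    msg = EmailMessage()
    msg["From"] = "me@my-alias-domain.example"   # the alias identity
    msg["To"] = "client@example.org"
    msg["Subject"] = "Hello from my alias"
    msg.set_content(body)
    return msg

def send_via_alias_smtp(msg):
    # Authenticate against the alias domain's SMTP server, so the
    # receiving side sees the alias as the true sender.
    with smtplib.SMTP("mail.my-alias-domain.example", 587) as smtp:
        smtp.starttls()
        smtp.login("me@my-alias-domain.example", "app-password")
        smtp.send_message(msg)
```

The design point is simply that the submitting server matches the From address, which is why the receiving client has nothing to annotate.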


Bing and Google

On 11 Apr, 2011 By mpm

I do a fair bit of SEO work for clients. I'm not one of those very serious SEO folks, but I do know my way around the not-so-black magic that SEO is. Of my many personal sites, this blog is the one where I pay the most attention to SEO (although I think that will change soon.) I tend to focus a lot on Google, since according to my analytics (and yes, they are Google Analytics - I'm wondering whether I should check out my server logs...), 99% of the search engine traffic to this blog comes from Google. But according to this article in Mashable, Bing gets 30% of overall search engine traffic. Now, I already know my audience is different, but that seems, well, remarkably different. In the last month, 3,743 visits came from Google, and 43 came from Bing. And I thought, "Aha! I haven't been paying attention to Bing in my SEO efforts - that must be the problem!" So I did some benchmarking. No, that wasn't the problem. In fact, for the set of phrases I used for benchmarking, Bing more often had me higher up in the results than Google! Things that make you go hmmmmm....


Interesting sites I'm looking at (weekly)

On 09 Apr, 2011 By mpm

Posted from Diigo. The rest of my favorite links are here.


Open Source vs. Proprietary: Graphics and Video

On 07 Apr, 2011 By mpm With 2 Comments

There are some very interesting comparisons to make in this realm, and I'd say, first off, that the proprietary tools are in the lead, for sure.

I'll start with basic graphics - graphic manipulation tools. On the proprietary side are the ever-present and predominant Adobe Photoshop and Illustrator. And, honestly, they are very good tools, and considered industry standards. On the open source side, the projects that stand out are GIMP (a Photoshop replacement) and Inkscape (a vector graphics program, like Illustrator). I've used GIMP for many years, and while I don't generally do a whole lot with graphics, it always serves my needs. There has been a lot of back and forth about the GIMP user interface. It is very unlike that of Photoshop - so much so, in fact, that someone came up with another project called Gimpshop, which redoes the UI to better match Photoshop. Both GIMP and Inkscape are completely cross-platform, and available for Mac, Windows, and Linux. I'll leave it to the graphics professionals to say for sure, but they are both worth a look if you don't want to plunk down hundreds of $ for Photoshop and Illustrator, and/or you like to work with open source tools.

The other realm I know something about is video. In terms of viewing, on the proprietary side are the players that come with the proprietary operating systems: iTunes/QuickTime comes native with Mac OS, and Windows Media Player with Windows. One doesn't have to pay for these, so it's a bit hard for open source (or other products, even) to compete - which is perhaps why the other major proprietary video player, RealPlayer, has had such a hard time catching on all these years. I notice they now seem to have added a ton of features (like video conversion from one format to another). On the open source side, one program you must know about is VLC, by VideoLAN. It is totally cross-platform (so cross-platform, they have a version for BeOS!) and it plays everything. I mean, everything. This means you don't have to keep several video players around to play different formats. I use it constantly - it's my go-to video player. It has a bunch of other features as well.

In terms of video editing, again, the proprietary programs have somewhat of a leg up on the open source ones, although a recent entry into the field may well change that (see below). On the "low end" (for people like me who make videos like this), there are, as in the video playing arena, iMovie and Windows Movie Maker, made by Apple and Microsoft respectively for their own platforms. (An aside: a lot has been said about the crapware iMovie has become - it used to be a really good video editor.) There are other proprietary products as well. I've used TrakAxPC, which has a free version and a paid version, and there are a variety of other low-end video editing options, including low-end versions of Adobe's Premiere (called Elements) and Apple's Final Cut (called Final Cut Express.) On the high end (where I'd love to work more), there are Apple's Final Cut (only available on Apple hardware) and Adobe Premiere (cross-platform). There are also quite a number of high-end Hollywood products, like Avid. (A side note: I used Avid a little bit, way back when it was the first and only non-linear video editing platform.)

On the open source end, there are some notable entries. Blender is a very popular cross-platform open source 3-D modeling, animation, and editing tool. It's actually pretty amazing what it can do. (There is a study that compares a bunch of 3D tools for professionals; you can see how Blender stacks up.) Another notable entry is Cinelerra, which only runs on Linux. (You can see videos edited with Cinelerra on Vimeo.) A recent entry into the fray, and the one that might make a huge difference, is Lightworks, one of the video editors that Hollywood uses, and which used to be proprietary. It will go open source later in the year, but you can grab it for free right now. Yes, a Hollywood-quality video editor for free, and soon to be open source. It's Windows-only for now, though.

In summary, proprietary software has the popularity edge, mostly. From this non-graphics-professional's perspective, though, it seems that you would not be left wanting if you went the open source route.


Web Application Frameworks

On 06 Apr, 2011 By mpm

If I got a dollar for every time I heard something like "we're trying to choose between Ruby on Rails and Drupal for our new website," or "our developer convinced us to do our new website in Ruby on Rails and we can't update it," I wouldn't be rich, but I'd have enough money for a very nice meal at an expensive restaurant. I know a lot of pretty serious geeks read this blog, but I also know that some folks who aren't read it too, and I figured it was time to do a quick outline of web application frameworks, and how they differ from things like a CMS.

A web server, in the physical sense of the phrase, is a box sitting in a data center (or under someone's desk) with a unique IP address, which answers queries from the internet and serves up data, depending on the request. In the software sense of the phrase, it is the actual piece of software (most often Apache, but sometimes something different) that runs in the background, listens for requests, and then serves up the data. That data is in some form of HTML, CSS, and JavaScript, because that is what browsers understand. However, how that HTML, CSS, and JS is generated varies depending on the system underneath. In the old days (when I was starting with web programming, back in the early-to-mid '90s) it was all HTML flat files (and not even much in the way of CSS or JS at the time), and dynamic elements were less common (you remember those days.) Now, only a minority of web servers actually serve HTML files - most serve HTML, CSS, and JavaScript dynamically generated by software like, in the case of this page you are reading now, WordPress.

WordPress, Drupal, and Joomla are CMS systems written in PHP. PHP is one of many programming languages; Plone, for instance, is written in Python. This isn't really the place to describe what programming languages are, or how they work, but Wikipedia (as always) has a nice entry, worth a read.

CMS systems are full-featured - they require no programming to install, configure, get going, or create a usable interface. They may require some programming to customize in particular ways, but I'd say most nonprofit websites don't need that. Most Drupal developers, for instance, don't spend a whole lot of their time in code unless they work on contributed modules (or contribute patches and such to core.)

A web application framework, by contrast, does require programming to provide even the basics of a user interface. The cool thing about frameworks for developers is that they provide a great leg up, and an easy way to use the model-view-controller design pattern - a powerful way to do development. The advantage of a framework is that it lets you build great custom apps a lot more easily and quickly than before (many web 2.0 apps are written using these frameworks). The disadvantage is that it takes significant programming to get user interfaces (especially on the admin side) working well. So using one to build a CMS (or a CRM, for that matter) is probably not a great idea, given the plethora of already-cooked options in the world. People who work with frameworks spend much of their time dealing with code.

Popular web application frameworks include Ruby on Rails (using the Ruby programming language), CakePHP (using PHP), and django (using Python.) Ruby on Rails is arguably the most popular MVC web framework at the moment, but there are a lot of folks using the others. The PHP frameworks (which include Cake, as well as Symfony and Zend) are pretty popular because of the plethora of PHP programmers out there. All of these frameworks get more sophisticated every year, and they are interesting to watch.
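To make the model-view-controller split concrete, here is a tiny, framework-free Python sketch. All names are invented for illustration; real frameworks like Rails, CakePHP, or django generate and structure far more of this for you, which is exactly the leg up described above.

```python
# A toy model-view-controller split, in plain Python, illustrating what
# MVC web frameworks structure for you. All names are invented; no real
# framework API is used.

# Model: owns the data and how it is stored.
POSTS = {1: {"title": "Hello", "body": "First post."}}

def get_post(post_id):
    return POSTS[post_id]

# View: owns the presentation - turns data into HTML.
def render_post(post):
    return f"<h1>{post['title']}</h1><p>{post['body']}</p>"

# Controller: owns the request flow - maps a URL to model + view.
def handle_request(path):
    if path.startswith("/post/"):
        post_id = int(path.rsplit("/", 1)[1])
        return render_post(get_post(post_id))
    return "<h1>404 Not Found</h1>"
```

The payoff of the pattern is that each piece can change independently: swap the dict for a database (model), restyle the HTML (view), or add routes (controller) without touching the other two.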


Open Source vs. Proprietary: CMS

On 04 Apr, 2011 By mpm With 5 Comments

Content Management Systems are an essential part of the communications function of nonprofit organizations. There are a myriad of options, and open source options are among the most popular - possibly the most popular. I'm going to focus here on the nonprofit sector, and on the options that are most common among nonprofits. On the proprietary side, there are a number of options, and they fall into three categories:

  1. Single-source proprietary custom CMS (from one web shop, or web host)
  2. Proprietary CMS as part of a large package (such as from Convio or Blackbaud)
  3. Proprietary stand-alone CMS (such as SharePoint.)

You already know what I think about option 1, so I won't belabor it here. Many people have found that option 2 - a large package that includes donation pages, event management, etc. - can be a really good option, and I certainly don't want to say that it is a bad idea. I think it can be a good one, but it also can be quite costly, and for many organizations it's overkill. And there are open source options that can do much of the same work for much less money. There are not a lot of stand-alone proprietary CMS systems in nonprofits these days. Microsoft SharePoint might be the most common I've heard of. Ektron is another one that I've heard folks talk about, as well as ExpressionEngine. The advantage of SharePoint for Microsoft-centric shops is its full integration with lots of internal network resources.

The open source options are many, but the big four - WordPress, Drupal, Joomla, and Plone - stand out from the pack. As you know, I am pretty loyal to Drupal (and secondarily, WordPress), but I have to say that Joomla and Plone are solid, wonderful projects, with great communities and active development, and they will serve you well. Check out Idealware's newish comparison of the four - it can help you figure out what's best based on your needs. Other open source options worth looking at include Alfresco, which is heavy on document management functionality, and DotNetNuke, which is based on .NET and somewhat popular among Windows users. Two up-and-comers I am very interested in following are Radiant and Refinery, both based on Ruby on Rails. There is also Django-CMS, written on top of the django framework (a Python framework.)

If you're really interested in open source CMS options, and looking not for data on features but for data on popularity, marketing, community, and such (a good idea if you are, for instance, a shop deciding which CMS systems to develop with and support), check out this report from Water and Stone (a digital marketing agency.) On the whole, though, I think the number and richness of options on the open source side is quite a bit better than on the proprietary side, and until I get an answer to this question, I can only guess that open source options have won over proprietary ones in the nonprofit sector.


Interesting Sites I'm looking at

On 02 Apr, 2011 By mpm

Posted from Diigo. The rest of my favorite links are here.


Drupalcon Highlight Reel

On 31 Mar, 2011 By mpm

I didn't make it to Drupalcon Chicago, but, thanks to the organizers of the conference, that doesn't mean I need to miss the sessions. I've been looking through videos of both the regular sessions and the ignite sessions (thanks, @gregoryheller), and here are my highlight presentations (this reflects what I'm interested in more than it reflects what's the best of DrupalCon):


Drupal/Salesforce Integration

On 23 Mar, 2011 By mpm With 9 Comments

A bit over a year ago, I wrote a post about the status of Drupal/Salesforce Integration. I figured it was time to do an update. At the moment, if you want to integrate Drupal and Salesforce, you have three options:

  1. Use the contributed modules (or have a developer install and configure them for you).
  2. Use Jackson River's Springboard.
  3. Roll your own (or have a developer roll your own for you.)

I'm going to talk about #1 in more detail in a bit. I've not had any experience with Springboard, but it's important to understand that it is not open source, and is maintained by only one shop. That is going to be an inherent weakness, no matter what. I don't know enough about it to compare it point-by-point with the contributed modules, but it's hard to imagine it keeping up, given the nature of open source development. All of that said, it's supposed to be an interesting all-in-one sort of option, so it's probably worth a look. Rolling your own is always a precarious proposition. I frankly can't imagine many situations where it would be preferable to modifying what's available and contributing the mods back.

So what is the status of the Drupal modules? Right now, there is an alpha release for Drupal 6, which is "alpha" in that very humble open source sense - it's being used on quite a number of production sites. It includes some great stuff. You can see an overview here, in the slide deck for a talk given at NTC last week, which compares Salesforce integration across three of the big open source CMS platforms: Plone, Drupal, and Joomla. There are four major projects:

  • Salesforce Suite, which includes:
    • The API - the core module that does the communicating with the Salesforce API
    • Contrib - a module that provides support for import/export from contributed modules
    • Export Queue (experimental) for queuing exports
    • Import - importing data from SF
    • Match - for matching objects before creating new ones
    • Node - for linking Drupal nodes to SF objects
    • Notifications (experimental, sort of - it's worked quite well for me) - allowing Drupal to handle SF outbound messages
    • User - matching users to SF objects
  • Salesforce/Ubercart - provides integration for Ubercart. Uses the Salesforce Suite API
  • Salesforce Feeds - allows for feeding SF records into Drupal via Feeds. Also uses the Salesforce Suite API
  • Salesforce Webform - Allows for passing data from a Drupal Webform to Salesforce. Currently does not use the Salesforce Suite API, and cannot be used on the same site as the Salesforce Suite, but hopefully that will change soon.

All of these modules are actively maintained, there is an active base of folks using and contributing to them (including me), and there are plans afoot for Drupal 7, with big improvements. Of course, there are still some snaggy spots, and it helps to know something about Salesforce to make this work really well, but I've gotten good results doing two-way sync of user and node data with the Salesforce Suite, and I have used the Salesforce Feeds module some. If you use Salesforce, want integration, and are pondering a CMS choice, definitely check out the overview slides. If you are using Drupal, want integration, and are considering a CRM, definitely consider Salesforce. And if you are already using both, and looking for ways to integrate them, drop me a line - I can either help you directly, or point you toward folks who can.
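The "Match" idea in the module list above - checking for an existing record before creating a new one - is what keeps a sync from filling your CRM with duplicate contacts. Here is a minimal, hypothetical Python sketch of that logic; it does not use the real Drupal or Salesforce APIs, and the record store and field names are invented for illustration.

```python
# Hypothetical "match before create" dedup logic, the idea behind the
# Salesforce Suite's Match module. No real Drupal or Salesforce API is
# used; the record list and field names here are invented.

def match_or_create(existing, record, key="email"):
    """Return (record, created): reuse a record with the same key field
    (compared case-insensitively) if one exists, else append the new one."""
    wanted = record[key].strip().lower()
    for rec in existing:
        if rec[key].strip().lower() == wanted:
            return rec, False          # matched: don't create a duplicate
    existing.append(record)
    return record, True                # no match: safe to create
```

In a real integration the "existing" side would be a query against CRM records rather than an in-memory list, but the matching step sits in the same place: between receiving the inbound data and creating anything new.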


Open Source vs. Proprietary: Nonprofit CRM

On 16 Mar, 2011 By mpm With 5 Comments

CRM systems (which I am defining rather loosely, rather than tightly, for the purpose of this post - as the tool or set of tools used to track constituents, donations, perhaps even events and volunteers) are arguably the most important technology tools that nonprofits use. Organizations use this tool to track donors, send out newsletters, track the success of campaigns, track who is engaged with the organization in what ways, etc. And, in my experience over the past 15 years, it\'s where organizations are willing to spend the most money on technology - often more than on their website or other technology tools - for good reason. Because of this, the deck has always been stacked against open source tools in this arena. The sheer number of vendors providing this toolset for nonprofits is huge (although rapidly shrinking.) Two of them (Convio and Blackbaud) are even publicly traded companies, which says a lot about the profit potential of this vertical. On the proprietary side, there is a wide range of available tools, from the relatively inexpensive, like Salesforce (web-based, including Convio Common Ground and the Nonprofit Starter Pack,) eTapestry (web-based, now owned by Blackbaud), Democracy in Action, and GiftWorks (desktop) to the egregiously expensive (you know which ones I mean.) Both NTEN and Idealware are the best sources for information about the range of options for this toolset - that\'s out of scope for this post. As you can tell, I\'ve lumped SaaS tools like Salesforce, DIA and eTapestry in with proprietary in this post - that\'s because that\'s what they are - proprietary. However, Salesforce in particular has a leg up that most other proprietary tools don\'t have, because of their open APIs and their incredibly robust development platform. That combination is impossible to beat if you need integration, ease of data movement, and a lot of customization. 
From my perspective, open data (via open APIs) can sometimes be more important to consider than whether or not a tool is open source - since integration with other tools, as well as using external tools of various sorts, is critical. Closed data systems, systems that are difficult to integrate, or systems that require payment to get access to your data should be avoided at all costs.

On the open source side, there are a number of options: you can choose an open source CRM package designed for business, like SugarCRM, and customize it for use in a nonprofit; use CiviCRM; or choose the desktop-based nonprofit CRM called MPX (built by a company called Orange Leap.) I'm excited about a new Drupal project called "Red Hen CRM," but it's very fledgling.

CiviCRM is a web-based open source nonprofit-focused CRM/donation management tool. It's been around for a while now, and is used by many organizations, some quite large (like the Wikimedia Foundation.) It is quite broad in its feature set - it has donation pages, event management, e-newsletter functionality, even a case-management system. I've installed, configured and administered CiviCRM many times, still work with it, and I have, like most developers, a love/hate relationship with it:

  • I love that it's open source/free software
  • It's got a great community of developers and users
  • I love that it's feature-rich - you cannot find the whole set of things it does in any proprietary tool that I've seen
  • It is a tool that has unmatched cost-effectiveness for small organizations
  • It's great that it integrates with both Drupal and Joomla (although the Drupal integration is by far the more solid - and it is a very nice integration - hard to get with proprietary tools)
  • It is relatively easy to set up for most functionality

But ...

  • Data migration into CiviCRM is often nightmarish (this is really where the hate lies)
  • Reporting tools are improving, but don't match the proprietary versions
  • It can sometimes be pretty tough to handle complex issues
  • It can be tough to troubleshoot issues
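Since data migration is where the hate lies, it's worth showing the kind of cleanup pass a migration usually needs before any CRM import will go smoothly. This is a minimal sketch under stated assumptions - the field names and rules here are illustrative, not CiviCRM's actual import specification:

```python
import csv
import io

def normalize_contacts(rows):
    """Clean up a spreadsheet export before feeding it to a CRM importer:
    trim whitespace, lowercase emails, split 'Jane Smith' into first/last,
    and drop duplicate email addresses (a common source of import errors)."""
    seen = set()
    cleaned = []
    for row in rows:
        email = row.get("email", "").strip().lower()
        if not email or email in seen:
            continue  # skip blanks and duplicates rather than let the import choke
        seen.add(email)
        name = row.get("name", "").strip()
        first, _, last = name.partition(" ")
        cleaned.append({"first_name": first, "last_name": last, "email": email})
    return cleaned

def to_csv(contacts):
    """Render the normalized records as CSV text for an import screen."""
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=["first_name", "last_name", "email"])
    writer.writeheader()
    writer.writerows(contacts)
    return buf.getvalue()

raw = [
    {"name": "Jane Smith", "email": " Jane@Example.org "},
    {"name": "Jane Smith", "email": "jane@example.org"},  # duplicate
    {"name": "Bob Jones", "email": "bob@example.org"},
]
print(to_csv(normalize_contacts(raw)))
```

Real migrations involve much messier data than this (multiple addresses, donation histories, relationship records), which is exactly why the process gets nightmarish.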

MPX is a desktop tool, and although it is open source (GPLv3), unlike CiviCRM or SugarCRM it is built on top of a proprietary stack (.NET/MS SQL Server.) It has primarily been used in faith-based organizations (that is Orange Leap's primary client base.) But it's a very full-featured product, and quite mature. So if you are a small organization that perhaps is still working with spreadsheets, CiviCRM is a great tool to check out. But in general, there are a lot of choices and, sadly, few of them are open source.


Alternatives to MySQL

On 14 Mar, 2011 By mpm With 5 Comments

For those of us that depend on MySQL every day, the buyout of Sun (which had bought MySQL) by Oracle did not bode well. A decidedly biased survey by the folks behind PostgreSQL suggests that many people worry about the health of MySQL in Oracle's hands. I've mentioned this before, and I do think the conventional wisdom is that open source software (which includes OpenOffice.org, MySQL and Java) will not flourish at Oracle. It makes sense - Oracle has never had a culture of fostering open source software, and it seems unlikely to develop one.

So what does someone do who builds their houses right on top of the LAMP stack (the M standing for MySQL)? For most folks, especially if they build on shared hosting infrastructures, this just isn't an issue. They depend upon their hosting providers, for whom it may or may not be an issue - but they won't have to think about it. For those folks in a position to choose which database software to use (for example, if you use VPS systems like Amazon, Slicehost, Linode, etc.), I think there are two pretty good options:

  • Go with MariaDB, which is basically a drop-in replacement for MySQL (and conveniently starts with an "M".)
  • Switch to PostgreSQL.

MariaDB is a branch of MySQL that came about because of the uncertainty relating to Oracle's ownership of MySQL. From the website:

In most respects MariaDB will work exactly as MySQL: all commands, interfaces, libraries and APIs that exist in MySQL also exist in MariaDB. There is no need to convert databases to switch to MariaDB. MariaDB is a true drop-in replacement of MySQL! Additionally, MariaDB has a lot of nice new features that you can take advantage of.

The problem is that the major Linux distributions (Ubuntu, Debian, Red Hat) don't yet have MariaDB in their repositories, so it will be a while before MariaDB is an easy apt-get or yum away from installation (there are some independent repositories and builds.)

PostgreSQL is a different beast entirely. It's been an also-ran in the open source database race, and I was, for many years, quite faithful to it. It's a very solid database, and it was ACID compliant before MySQL was. Its major weakness (and why the LAMP stack is called that and not the LAPP stack) was that it was a fair bit slower than MySQL. That weakness has long since been addressed, but the damage was already done. Many open source web database systems can use PostgreSQL instead of MySQL at this point. But PostgreSQL doesn't have the same large user base, and doesn't have many of the same web-based and desktop tools that MySQL does. There are differences in the SQL commands and such, and the command-line interface looks different. There is also a big difference in how auto-numbered fields get handled, but that's not really an issue for folks who don't dive deep into databases and code.

So which to go with? It probably makes sense to wait a bit - first for MariaDB to make it into mainstream repositories, and also to see what the fate of MySQL is. And checking out PostgreSQL is always a good option; it's a very good database system, and the likely flight from MySQL might do the project some good.
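To make the auto-numbered-fields point concrete: MySQL declares them with AUTO_INCREMENT, while PostgreSQL uses the SERIAL pseudo-type (backed by a sequence), so porting a schema means rewriting the DDL. Here's a deliberately naive sketch of that rewrite - a regex toy to illustrate the difference, not something to run a real migration with:

```python
import re

def mysql_ddl_to_postgres(ddl):
    """Naive translation of a MySQL CREATE TABLE statement to PostgreSQL:
    swap the AUTO_INCREMENT primary-key idiom for SERIAL and drop the
    MySQL-specific ENGINE clause. Real-world schemas need far more care."""
    # INT ... AUTO_INCREMENT  ->  SERIAL (PostgreSQL creates the sequence for you)
    ddl = re.sub(r"INT(?:EGER)?\s+NOT\s+NULL\s+AUTO_INCREMENT", "SERIAL",
                 ddl, flags=re.IGNORECASE)
    # Strip the storage-engine clause, which PostgreSQL does not understand
    ddl = re.sub(r"\s*ENGINE\s*=\s*\w+", "", ddl, flags=re.IGNORECASE)
    return ddl

mysql_ddl = (
    "CREATE TABLE donors (\n"
    "  id INT NOT NULL AUTO_INCREMENT,\n"
    "  name VARCHAR(255),\n"
    "  PRIMARY KEY (id)\n"
    ") ENGINE=InnoDB;"
)
print(mysql_ddl_to_postgres(mysql_ddl))
```

This is the sort of difference that hosting providers and CMS database layers absorb for you, which is why most folks never have to think about it.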



Open Source vs. Proprietary: Browsers

On 14 Mar, 2011 By mpm With 2 Comments

The browser wars between proprietary and open source browsers have changed in some ways from the days when it was simply Internet Explorer vs. Netscape. There are more players on both sides of the field, with some interesting complexities.

On the proprietary side still sits Internet Explorer, now about ready to pop with version 9. It definitely depends on who is gathering the data, but IE has about 44% of the market. This is down significantly from its high point, back in the dark ages of 2005, when it garnered over 90%. It has been dropping steadily since. This drop has been primarily, but not exclusively, due to the open source browser Firefox. More recently, however, two other proprietary browsers, Opera and Safari, have been increasing their own market shares. Opera now gets about 2% of the market (up from microscopic some years ago.) Safari, used mainly by Apple Mac users (although there is a Windows version), now gets about 5% of the market.

On the open source side, Firefox is certainly the leader, with a bit less than 30% of the market. Chrome, which is sort of an open source browser, is now getting around 14% of the market. So what do I mean when I say Chrome is "sort of an open source browser"? Chromium is the open source project which results in the browser Chrome - but there are a bunch of additions Google makes to Chrome which are proprietary, and not in the Chromium codebase. So, basically, between Firefox and Chrome, the open source side is a smidge in the lead over the proprietary side, but it's pretty close to even. And still, the primary reason for the difference is that IE ships with Windows (and Safari with Mac OS X); if Windows users don't take the step to download and install another browser, they will still just be using IE.

In the mobile space, things are very interesting. Opera Mobile is in the lead, with about 21%, followed by iPhone, Nokia, and BlackBerry. These are all proprietary. Bringing up the rear is Android, at 16%. But I'm sure that is going to change as Android begins to gobble up the mobile smartphone market share.


Take this short survey

On 13 Mar, 2011 By mpm

Take the Ada Initiative Census. The Ada Initiative is a nonprofit organization that works to support women in open culture (open source software, open standards, open content, etc.) Really great stuff. They have a new census that they are encouraging people to take. So please take it, and spread the word.



Open Source vs. Proprietary: Desktop Productivity

On 07 Mar, 2011 By mpm With 6 Comments

I recently wrote a blog entry about LibreOffice (LO), the fork of OpenOffice.org (OOo) that came after the acquisition of Sun (the old holder/maintainer of OOo) by Oracle. For the purposes of this blog entry, at this moment in time (early 2011), LibreOffice and OpenOffice.org are the same. (Funnily enough, for both packages, the executable is still called 'soffice' - for 'Star Office.')

I've been using this tool since it actually was StarOffice, more than 10 years ago, when it was first open sourced by Sun in 2000. For most of that time, except when I was doing heavy collaborative editing with colleagues who were using MS Office, it has been my office suite of choice. There have been many times, over the course of the years, when there were things I've thrown at OpenOffice.org that it couldn't handle, but those things have diminished year by year, and over the past couple of years, I've had absolutely nothing to complain about (nor have I submitted any bugs, which I did a fair bit of in the early 2000s.)

I would take a bet that 90% of people who use MS Office don't need to pay for it, but would do fine with OOo/LO, and that 70% of people could pick up OOo/LO and use it with no training or help, even if they are used to MS Office. It is the only fully cross-platform office suite with pretty much the same functionality and interface independent of platform. It reads and writes all MS Office formats (except for Access files.) It has a drawing program, an XML editor, and a math equation editor, in addition to the standard word processor, spreadsheet, presentation tool and database. Oh, and did I mention it's free as in beer, too, instead of adding a couple of hundred bucks or more to the price of a new PC? So what are its weaknesses?

  • The spreadsheet doesn't yet have all of the functionality of Excel. It's pretty darned close, but even I have to admit that Excel is darned hard to beat.
  • ~~It doesn't have The Ribbon~~ I think most people would say this is a strength. :-)
  • The presentation package isn\'t quite as polished as PowerPoint, although you can do most of what you can do with PP.
  • ~~It doesn't have Publisher~~
  • The database has not come anywhere near the functionality of Access.

The days when many a nonprofit was run on Access databases are coming to a close as things move more and more to the cloud. Google Docs will take a good long time to reach the point where its functionality begins to match either MS Office or OOo/LO, so OOo/LO is a very good alternative to MS Office if you don't need MS Access, and have folks able and willing to make a small adjustment to use this tool. I know that the fact that nonprofits can get MS Office for $30 or so makes a change unlikely, and I've carped about that one for years. But at least, for now, it seems that MS is still willing to be generous.


My Browser Stats

On 06 Mar, 2011 By mpm With 2 Comments

I was looking at my Google Analytics report for this blog, and came across an interesting thing: the browser share of those visiting my site, versus the North America browser share from Statcounter. Here are my stats: [chart: browser share of visitors to this blog] Here are the stats from Statcounter: [chart: Statcounter North America browser share] It's a bit hard to see, but my stats have IE third, whereas the Statcounter stats have IE out front, by a fair bit. Also, my stats have Chrome in 2nd place, while they have Chrome in 3rd, level with Safari, and a fair bit below Firefox. This falls into the category of "things that make you go hmmmmm..." Although in some ways, it makes sense, given that my audience is much more tech-savvy than the audiences of most websites. (For instance, my personal site, which gets much less traffic, and likely a less techy crowd, has stats much more similar to Statcounter's than this blog does.) So, anyway, way to go readers, making Firefox first! And for those 37 of you who visited this year using IE6, shame on you. Be nice to web developers and ditch IE6, please?



The Good, the Bad and the Ugly RFPs

On 03 Mar, 2011 By mpm With 2 Comments

In my time working on web development for nonprofit organizations, I've seen more RFPs than I can even begin to count. I've even written a few. And, since I've primarily been in the role of having to respond to an RFP, I've gotten pretty good at spotting RFPs that I feel don't serve either the organization or the developers well. Here is, in my estimation, the good, the bad, and the ugly in the realm of RFPs.

I'll start with the bad. A mistake I see very often in RFPs is an imbalance between what is articulated in the RFP and the kind of work that is required to pull off what's needed. Let me give an example: an RFP for a new website has 2 pages describing in detail needs provided by any modern CMS (web-based WYSIWYG editing, drop-down menus, new pages easily added, contact forms, etc.), and then a phrase dropped in like "integration with our CRM" or "event management system," without any detail as to what these things really mean (like, what is the CRM and what kind of integration is needed?) This invites a world of hurt, as you can imagine. Kind of like the sound made when the Man from Mars starts eating guitars in the Blondie song.

Then there is the ugly. The mistake that organizations most often make is that they have a five- or six-figure imagination, and a four-figure budget.

So what's the good? What makes a good RFP?

  • Do your homework: know what kinds of software options are available to build the kind of system you want, know what their capabilities are, and how much it generally costs to implement those basic capabilities. Learn about how hard customization of those platforms is (some are much easier than others.)
  • Understand that integration of most any two different systems is going to be four times as hard as you think, cost at least three times as much, and will do 1/2 of what you expect or want.
  • Hire a strategic consultant who really understands technology and the technological details of what you are looking for to help you figure out whether or not you can afford what you really want, and how best to articulate those needs in an RFP. Even an hour or two of their time will save you money and headaches. Someone who is a developer or who has been one in the past is a good bet.
  • Read this slide deck by Gunner of Aspiration!!



Open Source vs. Proprietary: Overview

On 28 Feb, 2011 By mpm With 1 Comment

Since I wrote my post on "Open Source vs. Proprietary" last week, and especially after Thomas Taylor's very apt comment that the battle is not over in many corners, I decided that, well, what the heck, it was a good time to write a series about open source software options, and how they compare to proprietary ones, in 2011, more than 12 years after this whole thing started. I'll highlight where the comparisons are interesting and compelling, especially for nonprofit organizations. I'll write a series of posts, covering the following topics:

  • Desktop software (OS, office suites, browsers, utility software, and other good stuff)
  • Comparisons of open source vs. proprietary development environments (i.e. PHP vs. .NET and that sort of thing)
  • CMS
  • Nonprofit CRM (including "SaaS" in the proprietary camp)
  • CRM/ERP more generally
  • Document Management
  • Other web applications
  • Open Source Communities, and how they have changed (and not)

I don't know what order I'll write about these things in - I guess just as the mood strikes me.


Changes ...

On 27 Feb, 2011 By mpm With 1 Comment

"Nothing endures but change" - Heraclitus

Sometimes, change happens when we're not looking for it, or we don't really want it. Sometimes changes that we don't want lead us to places that make more sense for us. This is one of those times. I've been struggling with health issues (life-altering, but not life-threatening, thankfully) for almost 6 months. They have led me to make a significant change in my work life. I have decided to step out of co-running OpenIssue, a business that I helped start more than 2 years ago, so that I can work part-time, to get (and stay) healthy. OpenIssue will continue with its strong team, focused entirely on CRM and data (using Salesforce, Convio Common Ground, etc.) I'll still be working with OpenIssue on odds and ends moving forward. I'll be looking for small projects, or to be a piece of larger projects. Please feel free to drop me a line, and please read my page about what I'm looking for.


Technology and the Environment

On 23 Feb, 2011 By mpm With 4 Comments

This is an issue I've been struggling with for a long time. I'm an unrepentant, unabashed technophile. OK, well, not so unrepentant or unabashed, since I'm writing this post on the varied factors around technology and the environment, and have been thinking about this issue for myself for a long time. And I will start this post off by being clear - this is as much an internal, personal conflict for me as anything, a place where I see my own behavior and my values diverge at times. We are approaching a time when just about everyone should be thinking hard about use of fossil fuels, consumption and waste. Global climate change is beginning to affect our lives in a real way. I offer this set of data points, if you will, with the hope that it will spark some thinking and perhaps discussion.

Cloud computing

There is some argument as to whether cloud computing is good or bad for the environment. On one hand, you have folks saying that moving organizational data and functions into the cloud means gaining efficiencies not possible in server rooms. Most organizational servers do not run at full capacity, whereas servers in the cloud generally are using more of their available capacity. One could argue that 100 servers running at 1/2 capacity is better than 200 servers running at 1/4 capacity. That said, we are doing a lot more than just moving stuff out to the cloud. We are creating whole new infrastructures that didn't exist before (think Facebook, Google, etc.) But the cloud also means that we can use lighter clients. Will the move to, for instance, tablet and phone computing be a net positive or negative in terms of resource consumption?

Green Hosting

There are a ton of hosting companies claiming that they are "green hosting." Just Google it. And you might see "powered by wind power" or some such. The truth is more complicated. Green hosting companies are just like any other hosting company. They have a big data center that's attached to the grid, from which they draw power. And they become "green" by purchasing renewable energy credits, or by purchasing carbon offsets. There are some arguments about whether or not these really help the environment. Figuring this out is far beyond the scope of this article. But I think it's fair to say that the jury is still out on this one.

Production and Disposal of Technology Equipment

So this is where it gets ugly. I remember, back in the heady days of the early Circuit Rider movement, when one of the big issues was that nonprofits had old, outdated equipment, and they never budgeted for its timely replacement. I remember we talked about planning to replace 1/4 to 1/3 of the hardware in an organization each year. The logic behind this is very hard to fault. Computing changes at a breathtaking pace. Software is written for current high-end hardware, so upgrading software on older machines is either painful or impossible. The argument goes: nonprofits need up-to-date tools to do their work effectively. It all makes sense, but what results is nonprofit technology's contribution to e-waste. And as our tools get more and more functional, and slimmer and smaller, and, well, cooler, we're more than happy to toss the old stuff in the trash. We don't see or interact with e-waste. We leave that to China, Ghana, and other countries. E-waste pollutes the environment and poisons people. And all because this technology, all of it, is "designed for the dump." (Follow that link, please.) (And, parenthetically, although it's not really about the environment, check out this information about Coltan, a necessary ingredient in many electronics, including mobile phones.)

And then there are the resources that go into producing our technological gizmos. For instance, it takes 500 pounds of fossil fuel, 50 pounds of metal, and an enormous 1.5 tons of water to make the average computer. That is a staggering amount of resources. And, between phones, tablets, e-readers, laptops, desktops, servers, routers ... it's an incredible amount of resource consumption and waste.

So what to do? I recently read an interesting article on "Seven Criteria for the Adoption of New Technology." It's written by someone who is working at living a simple life, and finds the same kinds of conflicts in this that I do:

As the world rushes toward an overcrowded but new and improved grave full of "articulated task lamps" with "industrial style charm," wines with "velvety" appeal, and cordless window shades that are "safe® for children and pets" (that's just one section of today's paper), I find my supposedly simple-living self caught on the same slow slide toward more. The bike I ride now is better than the one I had a year ago. Before long I'll need a new computer, and it will be better than the one I have now. The force of inevitability takes over. What is one to do? How exactly, and realistically, can a person resist, or cope, or somehow do something other than just get swept along? My impulse is to rant.

Here are my modified seven criteria:

1) How would the technology affect dynamics of organizations, friends, family and community?
2) Would it help us live and/or work in more stable circles, and strengthen our communities?
3) Is there a way to limit it, or does it push us down the slippery slope to even more?
4) Would it do "work that is clearly and demonstrably better" than the thing it replaces?
5) Who would want us to get it, and who would not?
6) Would it bring joy and satisfaction to life?
7) Does it represent what we believe in?

Thoughts?


Why Zen?

On 22 Feb, 2011 By mpm With 1 Comment

"Only the present moment is real and available to us. The peace we desire is not in some distant future, but it is something we can realize in the present moment." --Thich Nhat Hanh

I have been thinking recently about why I decided to call this blog "Zen and the Art of Nonprofit Technology." I named it that back in 2006 when I resurrected this blog (pun not really intended) after my time in seminary. If you've read Zen and the Art of Motorcycle Maintenance, you know that what's important is the journey, not the destination. Of course this is a common theme, and it's one that I try to always keep in mind. For me, the means are the ends, and how we do what we do is easily as important as what we do. The name of this blog was designed to make me keep that in mind when I wrote about technology. As you know, I've given up hope that by helping nonprofit organizations with technology I'm changing the world. But I do believe, strongly, that we change the world when we pay attention to what's going on now, to how we accomplish what we do. I think I just contradicted myself. Or, rather, entered into somewhat of a paradox. What I'm saying is that the end - making nonprofits super duper amazing users of technology - is not what's most important. It's how they get there that is.


LibreOffice vs. OpenOffice.org

On 21 Feb, 2011 By mpm With 1 Comment

I hope that everyone reading this blog has heard of OpenOffice.org. OpenOffice.org is a free and open source cross-platform office suite, which can read and write MS Office .doc, .xls, and .ppt files. It actually has more to it than that: there is a drawing program, a database, a math equation editor and more. It has been in development as OpenOffice.org since 1999, when Sun Microsystems bought the code from a company called Star Division (remember StarOffice?) (You can find an aged, but perhaps useful, webinar I did up on Slideshare.) 85% of what most nonprofits (and individuals) need out of MS Office, you can get in this package for free. Sorry, Clippy not included. OpenOffice.org has come an incredibly long way since the old days, and it is, now, quite a credible competitor to MS Office.

But then ... Oracle bought Sun. And just like the fears that many in the MySQL community have had about the future of MySQL under Oracle's watch (Oracle shut down the OpenSolaris project, for example), people were worried about the future of OpenOffice.org. The cool thing about open source software is that in situations like this, people can fork stuff. And they did. They formed an organization called The Document Foundation, forked the code from version 3.3 of OpenOffice.org, and called it LibreOffice. All of the major Linux distributions are going to include LibreOffice, some as the default office suite. I've already been using LibreOffice, and intend to stick with it, since, IMHO, a good bet is that anything FOSS will flounder and probably die in Oracle's hands. (Which is why I am also keeping a keen eye on MySQL drop-in replacements. You'll read about that one here.)



Open Source vs. Proprietary. Who won?

On 19 Feb, 2011 By mpm With 4 Comments

This epic battle between Open Source software (or Free software) and proprietary software is coming to a close. Some might argue that FOSS won the battle. Others would argue that proprietary software won. I'm going to argue that both won, and both lost.

The Desktop

About 10 years ago, the very big FOSS vs. proprietary battle was between Linux and Microsoft. The "year of the Linux desktop," where Linux becomes a dominant force in the desktop computing world, was predicted, but never came. It never will come. Er, well. No, actually it will, but in a form that no one could have predicted: Android. Android is based on a modified Linux kernel. If Linux had never existed, Android probably would never have existed. Smartphones and tablets are going to be the new desktops, and yes, the Apple iPad is there first, but like the Macintosh and the iPhone, there will be a wave of successor devices running Android that will overtake the iPad in a matter of a year or two, relegating Apple to a niche player once more. And this has moved so fast that Microsoft will also be a small niche player. Speaking of Apple and FOSS, Apple's OS X and iOS are based on the BSD operating system - another open source *NIX that has been around for a while. If it hadn't been for BSD, it's likely OS X and iOS wouldn't be what they are today.

The Server to the Cloud

Also 'round about 10 years ago, a battle was brewing between Linux and BSD and proprietary UNIX flavors like SCO's and Sun's, as well as between Linux and Windows. For a while, Linux (and to a lesser extent, BSD) was winning only against the proprietary UNIX flavors, and Windows servers were still heavily favored by enterprises that needed stuff like Exchange. That was true until ... the cloud. The cloud would not exist without FOSS. There is no way that this kind of inexpensive cloud architecture could have developed if everyone had had to depend on proprietary, licensed software. The cost required to either pay software makers, or recreate everything needed from scratch, would have made something like the cloud, or a Google, so expensive as to be impossible. But what's also true is that "the cloud" is, at its core, supremely proprietary. Not only do you not have access to the code running something like, say, Salesforce.com, but in some cases (such as the case of Facebook) the cloud service providers own your data, too! Even if you wanted to, you couldn't download your own copy of Google Apps to run on your desktop. And, at the same time, the cloud provides you with an ever-increasing set of features and functionality, with ever-increasing ease of use, at ever-decreasing costs. This is both made possible by open source software, and completely proprietary.

So there you have it. Open source software has won. It underlies the bulk of the technologies we use every day. And, at the same time, even as everybody interacts with FOSS every day, they don't (and won't) know it. And proprietary software has won, because in the final analysis, it's the proprietary layers on top of FOSS that people see and know, even though those layers depend completely on FOSS.



How to deal with technology change

On 18 Feb, 2011 By mpm With 2 Comments

I saw a call for a ColdFusion developer on an email list I'm on, and I couldn't help but think about technology choice and change, particularly in the website world, and how nonprofits deal with technology change (or don't deal with it.) ColdFusion has been around for 15 years (more than a century in Internet time), and although it has improved and developed, technologically it has been surpassed by its successors (including PHP, Java, Python, Ruby on Rails, and even .NET.) But this article isn't about CF, it's about technology change.

Technology is a rapidly moving beast. And the pressure to move forward, fast, is right there, always. It's part of our culture, from the advertisements for the neatest, newest, coolest phones, to the new TV you should have. And then there are those of us in the nonprofit technology community who are constantly on the bleeding edge of the next thing, whether it be hardware, software, or web services, constantly talking about it, and about how it's going to make it easier/better/faster to change the world. Although I often get snarky about this, I am aware that I am guilty of it, too. Most nonprofits are not run by geeks. Most nonprofit leaders think of technology as something between a useful tool to be leveraged and a necessary evil. They are resistant to rapid changes in their technology, as well they should be. And they depend on geeks to help them get things done.

I have a story about a nonprofit with a website they can't leverage for their mission. Although complete fiction, this story will feel quite familiar. And I know I've been a guilty party in a real version of it at some point in my career. A small nonprofit has a small staff who know a lot about their mission, but nothing about how to create a website. A friend of one of the board members is a web developer. They hire that web developer to put a new site together. The developer waxes poetic about the capabilities of a platform called AmazingWebCreator. They imagine the developer knows what they are talking about. The developer builds the website, then goes away. The organization is happy for a while; they have a website with pages they can easily edit using a web form, which is more than they had before. Then, in months, or years, they want to add some new pages, or a new section to their site, and a widget on the side. But they realize they don't know how to do that. They call the developer, who is busy now using AmazingWebCreator on some huge project, or has moved on to SuperDuperWebCreator, and doesn't have time for them. They have to bring in another developer who knows AmazingWebCreator, which may cost them time and $.

Of course, the critical factor here is: what is "AmazingWebCreator"? If it is a relatively new CMS (like WordPress, Drupal, Joomla and others), they may not need to bring in a new developer - they may just be able to get a book, or buy a video, to teach them how to use the web interface to create new regions and widgets. If "AmazingWebCreator" is a platform like Ruby on Rails, Django, .NET, Java, or ColdFusion, they are most likely going to have to hire someone to do that work for them, and depending on the platform, those developers may be either few and far between (ColdFusion) or in high demand, and therefore relatively expensive (Ruby on Rails, Java.) Worst, of course, is if AmazingWebCreator is a proprietary, custom CMS that the developer wrote themselves in 2002, and no longer supports.

How is an organization supposed to make an informed choice about a website platform? I have a few suggestions:

1) Assumptions: First, assume the person/people who develop your site might not be around in a year or so. And assume there are things you can't conceive of now that you'll want to do in a year. Don't assume the platform that your buddy chose for their organization's site is the right one for you. Don't assume that the most popular platform is necessarily the right one, either.

2) Feature set: Garden-variety website, or very specialized functionality? (By "garden variety" I don't mean brochureware. I mean the average, normal features of most nonprofit organizations' websites. These include such things as donation buttons/pages, membership lists, blogs, etc.)

3) Platform choice: Look at a number of things. If it's open source - how many developers are there? How many people use it? How easy is it to find developers? Can most new functionality be added via the web interface, or will it require back-end coding? Is it a custom CMS, written, maintained and supported by a single shop? (NEVER, EVER, EVER CHOOSE THESE. Here's why. Luckily, they are an endangered species.) If it is proprietary, or software-as-a-service, are the extra features really worth the cost? Are there many consultants and developers who can assist you with the platform?

4) Lifecycle: Is the platform early in its life, at its peak, or late (or very, very late)? Bleeding edge might hurt; an aged platform might crumble underneath the weight.

There are lots of folks (I do this on occasion on this blog, and Idealware is a great resource) who can provide you with information about specific platforms, and comparisons between them. Read, read, read, and ask many questions before you decide.


Reader solicitation

On 17 Feb, 2011 By mpm With 3 Comments

As you can tell, I've been writing more lately, and I plan, for the time being at least, to really step up my blogging game. I've got a list of posts of my own I want to write, but I realized that some long-time readers of this blog might want me to write about specific things that fit under my basic purview: research you've been too busy to do, something you want my unique opinion on, something you're curious about. So, I'm soliciting ideas. No guarantees I'll blog about it, but feel free to put in comments (or email me, if you're feeling a need for privacy) topics you'd like me to cover. Here's my list of upcoming topics:

  • Has Open Source won or lost, or is the struggle still going on?
  • Updates on Open Content and Copyleft of things other than code that nonprofits might be interested in.
  • Ruby on Rails (varied topics).
  • Drupal Provisioning
  • A beginners guide to NoSQL.
  • Reasons why nonprofits, and nonprofit technology in particular, should work to expand the economic models by which they work.
  • How to be really anonymous technologically (for activist reasons), and the flip side: how to make sure people know you are who you say you are, and the ways people spoof identities.
  • Cloud development platforms.
  • Why technology both sucks for the environment, and is good for the environment - how to find the sweet spot.
  • Podcasting 101.

That's my list so far, and I'd love to add your ideas to the mix.



eBooks #2: So you want to e-publish? Mechanics...

On 16 Feb, 2011 By mpm With 5 Comments

As most of you know, I'm a writer. I write a fair bit of science fiction, and also write other stuff. Lately, I've been thinking a lot about what I want to do to get my novels out in the world, and have been greatly influenced by Cory Doctorow in terms of copyright (or, more accurately, copyleft). Obviously, publishing eBooks is something I'm going to do at some point, perhaps sooner rather than later.

I'm talking in this post about self-publishing eBooks: what are the options, and how do you go about doing it? Since this is a technology blog, and not a marketing blog, I won't talk about the details of getting an ISBN, getting nice looking cover art, getting the word out, etc. There are also avenues that will allow you to sell your print book alongside your eBook. All of those issues I'll leave to other folks. I'm going to talk here about the mechanics of eBook publishing.

Mechanics

If you want to put your book into formats that the widest variety of people will be able to read, think about these two important factors:

  • Distribution avenues: Amazon, Barnes and Noble, Google, and Apple would be the ones I'd focus on, as well as whatever other avenues you want to use to get your electronic files out there.
  • Devices: Kindle, iPad/iPhone, Android Tablets/Android Phones, Nook, Computer (in that vague order of preference)

This will define what steps you need to take to get your manuscript into eBook format.

Amazon has what's called "Kindle Direct Publishing" - the way you self-publish your book on Amazon. There is a lot of information there on how to get going, including how to get your document into the right format. You can upload either a .doc or .docx file, or a Mobipocket format file.

Unlike Amazon, which puts all self-published and publisher-published books into the same pot, Barnes and Noble has this thing called "PubIt!" It's a bit segregated from the rest of the books in the Nook store. You need to jump through several hoops to get registered, etc. Once you do, your book can be uploaded either as a .doc or .docx, or you can make an ePub version to upload (more on that below).

Apple also has a system by which you sign up to sell your books in their iBookstore. They also use ePub, so you'll need to get your book into that format.

Google also makes you join a partner program to publish your book. "Google Editions" is the process by which you get Google to sell your book. In that case, you need to upload an ePub version. Nicely, Google allows you to distribute your eBook without DRM, an option no other vendor seems to have.

(This is sort of an off-topic digression, but it's interesting, so I'll insert it anyway. From what I can tell, Barnes and Noble is the only one of these outlets that really distinguishes between self-publishers and regular publishers in the experience of customers viewing and searching for eBooks. In fact, Google and Apple seem to have the exact same back-end process to get books sold, whether you are a big regular publisher, or little ol' me sitting in my living room...)

Formats

OK, so now that we've gotten the bureaucratic crap out of the way, how in the bleep do you get your book into the proper format(s)? I wrote a blog entry a bit ago on eBook formats.
The distributors above (plus perhaps your own distribution process) necessitate moving your manuscript from the word processing format you wrote it in to some other format. I'd say you eventually want it in three formats:

  1. PDF
  2. ePub
  3. MobiPocket

PDF is easy. MS Word, OpenOffice.org, and LibreOffice (a recent fork of OpenOffice.org - that's another blog post) all have PDF export facilities, so that job is simple. Make sure, of course, that your manuscript has the right types of cover pages, etc., that are standard for eBooks. (Here's a nice, short guide.) There are many methods for converting files to ePub and Mobi format; which to use depends on your platform, your budget, and your technology skillset:

  • epub-tools: an open source, command-line set of tools for conversion to ePub format. (free)
  • Calibre: a cross-platform suite of tools for eBook management, conversion, etc. I haven't spent much time with it yet, but it's free, and seems to have a pretty nice feature set, including conversion to ePub and Mobi. (free)
  • Adobe InDesign: the rather expensive desktop publishing program includes, apparently, conversion to ePub (but not to Mobi). ($$)
  • ecub: a cross-platform program to convert files to ePub and Mobi. (free)
  • Jutoh: another cross-platform program that does multiple format conversions. ($)
  • odftoepub: an OpenOffice.org plug-in for conversion to ePub format. ($)
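
Incidentally, one reason so many tools can produce ePub is that the format is simple under the hood: an .epub file is just a ZIP archive with a prescribed layout. Here is a minimal sketch in Python, using only the standard library - the file names (book.opf, chapter1.xhtml) and the stripped-down package document are invented for illustration, and a real converter would add a full manifest, spine, and table of contents:

```python
# Sketch of the minimal structure of an .epub file: a ZIP archive whose
# first entry is an uncompressed "mimetype" file, plus a container.xml
# that points at an OPF package document describing the book.
import zipfile

CONTAINER_XML = """<?xml version="1.0"?>
<container version="1.0"
           xmlns="urn:oasis:names:tc:opendocument:xmlns:container">
  <rootfiles>
    <rootfile full-path="OEBPS/book.opf"
              media-type="application/oebps-package+xml"/>
  </rootfiles>
</container>"""

# Heavily abbreviated package document (metadata, manifest, spine).
BOOK_OPF = """<?xml version="1.0"?>
<package version="2.0" xmlns="http://www.idpf.org/2007/opf"
         unique-identifier="bookid">
  <metadata xmlns:dc="http://purl.org/dc/elements/1.1/">
    <dc:title>Demo</dc:title>
    <dc:identifier id="bookid">demo-0001</dc:identifier>
    <dc:language>en</dc:language>
  </metadata>
  <manifest>
    <item id="ch1" href="chapter1.xhtml"
          media-type="application/xhtml+xml"/>
  </manifest>
  <spine>
    <itemref idref="ch1"/>
  </spine>
</package>"""

def write_minimal_epub(path, chapter_xhtml):
    with zipfile.ZipFile(path, "w", zipfile.ZIP_DEFLATED) as z:
        # The mimetype entry must come first and be stored uncompressed.
        z.writestr("mimetype", "application/epub+zip",
                   compress_type=zipfile.ZIP_STORED)
        z.writestr("META-INF/container.xml", CONTAINER_XML)
        z.writestr("OEBPS/book.opf", BOOK_OPF)
        z.writestr("OEBPS/chapter1.xhtml", chapter_xhtml)

write_minimal_epub("demo.epub", "<html><body><p>Hello.</p></body></html>")
```

This is not a production-ready generator (a strictly valid EPUB 2 also needs an NCX table of contents, for instance), but it shows why the conversion tools above are mostly doing document cleanup, not exotic binary packaging.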

After you create your files, you'll want to look at them to see if they converted well. Obviously, checking them out on as many devices as you can would be good - you can sideload any of these files onto different devices to test them out. You can also use desktop reader applications to independently view your creations.

One small note: unlike ePub, which is an open standard, Mobipocket was a company with its own proprietary format; it was acquired by Amazon in 2005.


WordPress vs. Drupal ... fight!

On 15 Feb, 2011 By mpm With 25 Comments

As a user and developer of WordPress since 1.x-something, and a developer and user of Drupal since 4.7, I figured that with the release of Drupal 7, this would be a great time to do a comparison of the two. If you want a really detailed look, please read the exhaustive, recently released and updated Idealware report on open source CMSes, which includes Drupal, WordPress, Joomla, and Plone. I did the research for the original report released a couple of years ago, so it's been a while since I've come back to comparing these two platforms. Also, this is primarily going to be from the developer's point of view, although I'll talk some about user interface and experience. (A caveat: I have more experience, especially with larger sites, in Drupal than in WordPress, so there are things I may be missing. Feel free to comment on what I got wrong.)

WordPress started out with a focus on ease of use for bloggers and content creators, and secondarily on providing a platform for developers to build plug-ins and such. WordPress was born primarily as a blogging tool, and has expanded outside of that realm to encompass different kinds of content management use cases. Drupal started out primarily as a web content development platform, with a strength in community features; a focus on ease of use didn't come about until Drupal 7.

At this point, both Drupal 7 and WordPress make it pretty easy for end users to add and edit content, and to do simple administrative tasks (moderating comments, etc.). They both have a very nice array of canned themes available to use, and they both have some customizable themes (themes that make it easy to customize without needing to know much HTML or PHP - like Thesis) available. Getting a site up and running on either platform is pretty easy, although neither is really ready for non-techies to take on. That said, most good webhosts have one-click installs of both CMS platforms.

Out of the box, WordPress still has only two content types: blog posts and pages. You can't have different kinds of pages, or different kinds of blog posts, or some other content type (news, events, etc.) that isn't one or the other. That is a deal-breaker for many kinds of sites. There are plug-ins that allow you to create custom content types - I haven't tried these, so I can't comment - but it seems a big deal that this is core for Drupal and an add-on for WordPress. This, and the absence in WordPress of a way to easily control how lists of content are presented and viewed, are the major platform differentiators. That said, many, many websites need neither of these features.

And if you want to get more deeply under the hood, both require some understanding of the respective platform (how plug-ins work in WordPress, how modules work in Drupal), and probably a bit of PHP, HTML, or AJAX to add bells and whistles to the theme. Given some big changes in Drupal core, such as adding fields to nodes and image handling in core, some things are much easier to deal with in Drupal 7 than in previous versions, getting close to the ease of use of WordPress in that regard.

Kinds of sites probably best done in WordPress:

  • Blogs
  • Community Blogs
  • Simple brochureware websites

Kinds of sites best done in Drupal:

  • Large community sites where you need different kinds of content generated by users (blogs, wikis, job postings, etc.)
  • Complex, document-heavy library sites, or sites that need document management
  • Sites where you want complex control over multiple content types - how they are created and viewed
  • Magazine/Newspaper like sites where you want to control how lists of content are displayed and ordered
  • eCommerce sites
  • Sites with deep integrations to CRM platforms and web services

Kinds of sites where it\'s a tossup:

  • Medium or large websites with lots of content, but relatively simple organization
  • Community blogs with many authors and identified, authenticated users

Bottom line: they are both such amazing, solid platforms, with rich, deep ecosystems of plug-in/module developers, implementors, designers, etc., that it's hard to go wrong picking either one, as long as you are clear on the feature set you need. They have rock-solid core development teams, security updates, and overall good code, which you could hardly say about either platform 4 years ago. Also, I have to say, as much as I respect other open source CMS platforms, IMHO 98% of websites can be served by either of these two. That's what's true right at this moment. Three or so years down the pike, I'm going to be looking at platforms based on Ruby on Rails - as Rails gets more mainstream and solid CMS platforms start to mature, that will be the space to watch. But that's another blog entry, isn't it?


eCommerce #1: Options

On 13 Feb, 2011 By mpm With 8 Comments

Nonprofits don't use e-commerce much, but I've had some experience (on both sides of the profit fence) doing e-commerce, and for some reason, shopping carts are intriguing me at the moment, so I figure it's a good time to know what's out there, especially in the open source shopping cart world. What would I use if someone came to me wanting to set up a store? The last time I looked closely at this (which was a few years ago), the situation was different - there wasn't much in the way of open source shopping carts. Today, there are a ton, some better than others. Here are the options I've found:

  • Zen Cart - LAMP stack program, at version 1.3.9. Has a community forum, and seems to be pretty popular. Dreamhost, at least, has it as a one-click install.
  • Magento - also LAMP stack, and using the SugarCRM business model (which I will admit is not nearly my favorite): a community version with fewer features and no support, alongside paid versions. The paid versions seem extraordinarily expensive ($3,000 - $13,000 per year; I'm assuming for that we're talking a high-end shopping cart system).
  • Ubercart - Ubercart is a Drupal module, and the one of this bunch I have the most experience with. Because it is a Drupal module, the vast array of features available with Drupal is right there, so the shopping cart system doesn't have to provide them. This is a big plus.
  • Open Cart - also LAMP. Like Ubercart and Zen Cart, this is a truly open source community effort, with an ecosystem of providers rather than a business model.
  • PrestaShop - also LAMP. More like Magento in business model. My pet peeve: a download form that requires you to put in your email address. Hate that. Most add-ons for PrestaShop cost money.
  • osCommerce - seems to have most of its popularity in Europe. LAMP stack program.
  • Spree - open source Ruby on Rails eCommerce program. It's younger than most of the list above, but interesting.

There are others, but they are much less popular and much less feature-rich - not much reason to choose them at this moment. There are several WordPress shopping cart plug-ins that seem worth looking at, if you are wedded to WordPress - some open source, some not. Of course, so much depends on how much you want to sell, how you want to sell, what you want to sell, and how important your store is in comparison with the rest of your website (for instance, do you set up an entirely separate store, or use something like Ubercart as part of your website?). Those are the key questions to answer before you compare the features and technologies of these systems to make a decision.


What IPv6 means to you

On 13 Feb, 2011 By mpm With 1 Comments

For those of you who don't know about IP addresses, here's a very quick lesson. In order for one computer to talk to another computer on the internet, it needs an address, the same way you have an address so that people know where to send your junk mail catalogs. Human beings suck at remembering numbers, so a system for connecting names to numbers exists (called the Domain Name System, or DNS). But the core underlying structure is computers talking to each other via numbers that run from 0.0.0.0 to 255.255.255.255. This system is called IPv4, or Internet Protocol version 4. If you're quick at math, IPv4 has 2^32 possible addresses. That's 4,294,967,296 - four billion, plus. That seems like a lot. But guess what? It's not nearly enough. Certainly not enough for a world with increasingly connected devices - things we'd never considered 20 or so years ago, like your TV and your refrigerator, let alone millions and millions of cell phones. Right now, most people in the US own at least 2 or 3 devices that need an IP address: your computer, your laptop, your phone, your tablet, your cable box, etc.

We've known for years (since the '80s) that we'd run out of IPv4 addresses sooner or later. Well, later has come. IANA (the Internet Assigned Numbers Authority) has now given out all of the IPv4 address blocks it has. Unallocated IPv4 addresses will run out in August of this year. Yes, this year (right after my birthday, in fact). So what's next? What do we do in this situation? In comes IPv6, a new and improved Internet Protocol. IPv6 has a different numbering scheme, written in hexadecimal: addresses range from 0:0:0:0:0:0:0:0 to ffff:ffff:ffff:ffff:ffff:ffff:ffff:ffff. This allows for 3.4 x 10^38 addresses. Officially, that is 340 undecillion, but that's really gazillions and gazillions. IPv6 is no longer in the future the way it used to be. It's NOW, and you have no choice but to deal with it.
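
If you want to sanity-check that arithmetic yourself, a few lines of Python with the standard-library ipaddress module will do it. (The example address below uses 2001:db8::, the prefix reserved for documentation, not a real allocation.)

```python
# Back-of-the-envelope check of the address-space math:
# IPv4 addresses are 32-bit, IPv6 addresses are 128-bit.
import ipaddress

ipv4_total = 2 ** 32
ipv6_total = 2 ** 128
print(ipv4_total)                 # 4294967296 -- "four billion, plus"
print(format(ipv6_total, ".1e"))  # 3.4e+38

# The same IPv6 address, in compressed and fully written-out forms:
addr = ipaddress.ip_address("2001:0db8:0000:0000:0000:0000:0000:0001")
print(addr)                       # 2001:db8::1
print(addr.exploded)              # 2001:0db8:0000:0000:0000:0000:0000:0001
```

The compressed form (runs of zeros collapsed to "::") is the one you'll usually see in configuration files and log output.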
Luckily, most of you reading this blog shouldn't have to worry too much - although if you have older hardware (computers, cable modems, routers, etc.), you may be in a bit of trouble when your ISP does the switchover. For most folks with reasonably recent hardware, the issue sits entirely with your ISP. June 8, 2011, is being called "World IPv6 Day." If you own a website (on a VPS, or a server inside your firewall), you may well have work to do. Check with your hosting provider or ISP to find out what you need to do to make sure you're ready.


Drupal 7

On 24 Jan, 2011 By mpm With 3 Comments

I've had a bit of time now to work with Drupal 7. I've been playing with it since it was still pretty experimental, but I finally put together a whole site with it recently, and am pretty happy with it. It's gotten a big leg up in terms of usability - this was a major focus for this release. The basic user interface is much improved over Drupal 6, and unrecognizable if you've only been using Drupal 5 or earlier. In my opinion, the usability advantages some CMSes had over Drupal (I'm thinking specifically of Joomla and WordPress) have been diminished or even eliminated - especially in the case of Joomla. WordPress still has a usability advantage if you are creating a simple blog site, but if you're using WordPress to make a more generic website, I'd look twice or three times at Drupal 7. There are some pretty serious under-the-hood improvements as well. I'm looking forward to when all of the modules I depend on are up to speed. Some of the most important ones made it into core, and many others were released with Drupal 7. Some others are close, like Ubercart, which has a version in beta, and Views, which surprisingly is still in alpha.

If you are running a site on Drupal 5, you need to migrate at least to 6, but possibly to 7, if you can. Drupal 5 is no longer supported, which means that if there are security issues, they won't be addressed. There will be no more updates to contrib modules for Drupal 5 (in fact, many Drupal 5 modules were abandoned a while ago). Lullabot has a nice article on upgrading your site from Drupal 5 to Drupal 6, in case your site isn't ready for Drupal 7.



eBooks #1: ePub is to eBooks as MP3 is to music?

On 21 Jan, 2011 By mpm With 3 Comments

If you've been around the block as long as I have, you remember the days before an audio codec was settled upon. eBooks are moving into adolescence, and the question is: which format will win, or does one format have to win?

For a while there, the two big players on the field were Amazon on one side, with its Kindle, a proprietary format that is an offshoot of the Mobipocket format, and a reader that handles a fairly limited range of formats; on the other side was Barnes and Noble, with the Nook and its own format, which is based on ePub. Both Amazon and Barnes and Noble have DRM in their book formats. And, of course, they aren't the only players in the field. Sony has its own reader and format, although, like the Nook and the Kindle, it can read a variety of formats. I won't go into exhaustive detail here on the wide variety of readers and formats; there's a great Wikipedia article to do that work for me. What I want to talk a bit about is what's next.

Amazon and Barnes and Noble were all geared up for a big fight, until a number of things happened to change the equation. First, Apple came out with the iPhone, and both Amazon and Barnes and Noble released software that allowed you to read the books you'd bought on that device. Soon after came Android phones, and the same thing happened. Now, you didn't need to own one of their devices to read books sold in their stores. Next, Apple released the iPad and its iBookstore, providing some serious competition to all three of these established players. On one hand, Apple gave books its very popular iTunes Store-like treatment. On the other hand, as most people who have read books on a reader like the Kindle know, a device like the iPad is actually not optimal for reading books for long stretches of time. The truth is, though, the iPad became the second most popular device for reading eBooks very rapidly, mostly to the detriment of the Kindle. All other readers have tiny market share in comparison to those two players.

But with the soon-to-be plethora of Android tablet competitors to the iPad (as well as tablets using E Ink), and the ability to read Amazon, Barnes and Noble, and Sony books on Android, it does seem that there isn't a huge need to come to any sort of standard. But then Google enters the fray, in a bigger way than just with Android: the Google eBookstore! Google decided to go with the ePub format for their bookstore, with Adobe DRM. Because they did that, users who buy books at the Google eBookstore can read those books on just about every device except... the Kindle, which does not support ePub.

So what happens next? This is my bet, although it will be interesting in a few years to find out whether I'm correct:

  • Amazon and Barnes and Noble leave the hardware manufacturing biz when inexpensive, credible, good E Ink Android-based devices come out - those become the standard devices for eBook reading. (NB: the Nook is actually based on Android itself, but I still think B&N will exit the hardware biz.)
  • Sony's eBook reader and store die because no one uses them.
  • Google becomes second only to Amazon in eBook selling, eventually toppling Apple from the #2 spot.
  • Apple, like always, remains the stylish, expensive niche player. They don't have the same success with eBooks that they did with music.
  • ePub and Amazon's format both remain viable for years to come. Other formats wane in importance.

Next up, a blog post about what you should do if you want to self-publish your book as an eBook.


Email is dead ... long live Email?

On 27 Dec, 2010 By mpm With 3 Comments

Reports of the death of email have, of course, been rampant for the past, oh, 10 years or so. First, spam was going to kill email. It certainly is true that most email sent these days is spam, but that hasn't managed to kill it. More recently, Facebook and Twitter were considered likely candidates for killing it off. Recent studies suggest that there is a demographic shift happening: social media is the more primary communications avenue for Millennials and Gen Y, and email for everyone older.

I was having lunch this week with a friend who runs a small advocacy organization in Canada that relies heavily on the use of email lists. The question arose as to whether or not it was worth thinking about shifting some of that communications traffic from email lists to the web in some form. I think that's a big question - it matters a lot what the demographics of the organization are, what the goals of the email lists are, and the direction things are going. Also, of course, lots of people are beginning to say: go where the people are, which is increasingly Facebook. I tend to shy away from that suggestion - putting all of one's eggs in a basket you have no control over always seems dangerous to me, even if the basket is free.

I decided it was time to do a roundup of the discussion and collaboration alternatives that exist at this point. I'm sure I'll miss some, so please add more in comments. I'm focusing on tools that are free and open source - tools that you would install on your own server, or your own private cloud. Someone else can round up the free services.

  1. Drupal Organic Groups: Organic Groups is a Drupal module that allows for discussions, wikis, and the like. It's a very powerful tool, and combined with other Drupal modules, there is a lot you can do with it. A distribution that includes OG and other tools is called Drupal Commons, and is worth a look - it has matured quite well.
  2. Elgg: Elgg is a powerful open source social network tool. I've blogged about Elgg before. It has everything you'd want in a private social media site - friending, profiles, groups, collaboration tools, etc. It's modular, themable, and has a very active development community.
  3. Etherpad: I'm including Etherpad because, although it's more of a collaboration tool than a discussion tool, you can have live chat at the same time as you are collaborating. It works best for larger groups, but it's a pretty amazing tool. There are some hosted versions you can try out. (And from that link, you can see that Google acquired the company behind it, so some of that technology has made it into Google Apps - but you can still get the Etherpad code.)
  4. phpBB: phpBB (BB stands for bulletin board) has been around forever, and is still kicking.
  5. BuddyPress: BuddyPress is a plug-in that adds features to a WordPress MU (multi-site) site, such as profiles, groups, friending, etc.
  6. Redmine: Redmine is, as you should know by now, my absolute favorite project management software - it blows every other one I've tried out of the water (Basecamp, Central Desktop, MS Project, ActiveCollab, etc.). Its strengths are ticket tracking and such, but it does have forums that work quite well.
  7. Crabgrass: Crabgrass is a lot like Elgg, except it is written in Ruby on Rails, and has a specific orientation toward groups that do grassroots organizing.
  8. Diaspora: Diaspora is a Facebook alternative. You can, apparently, run your own Diaspora server. I haven't gotten my Diaspora account yet, so I don't know the feature set well, but I'm assuming there will be groups and group discussions available.


Salesforce.com and Ruby on Rails

On 17 Dec, 2010 By mpm

Programming languages and I have issues. By now, I've learned quite a number of them (I think 9, by last count), but for some reason, I seem to pick up each one just at the top of its curve, or as it is going down. I have yet to manage to pick one early. I learned C at the height of its popularity, just as C++ was beginning to rise. I learned Fortran when it was almost dead, mostly for fun. I learned Pascal toward the tail end of its reign. In the late '90s, I chose to write a CMS in Perl instead of PHP. Dumb idea. I've been moderately interested in Ruby and Rails for years now, although I haven't yet spent very much time getting my hands really into coding Ruby.

As pretty much all of you in the Salesforce.com world know, Salesforce.com agreed to buy Heroku for a pretty big chunk of change. I'd played with Heroku a little a while back, and I thought it rocked. What is Heroku? Heroku is cloud Ruby on Rails: build a Rails app, and deploy it on Heroku. It's pretty sweet. So why would Salesforce.com buy it? On one level, it makes über sense to me. As someone who has managed to learn some Apex, which is, frankly, somewhat of a monster of a programming language, it's pretty clear that it's not super easy to build complex apps using it. It's like Java in heavy chains. A well-joined Rails and Salesforce.com platform, all in the cloud, would simply rock. (In case you are wondering, there already is a Ruby toolkit for the Salesforce API, although it looks like it only works on Rails 2.3, not 3.)

On another level, it's fascinating. The culture of the Ruby and Rails world - open source, community-driven, a gift-economy meritocracy - is very different from the Salesforce.com world, which is proprietary, business-oriented, and certifications-focused. Of course, these are stereotypes - there are plenty of business-oriented Rails folks, and plenty of open-source-oriented Salesforce folks - but the worlds really are culturally very different. I'll have a post soon where I talk in detail about why I think open source has both won and lost the open source vs. proprietary war, but this particular intercultural marriage will be interesting to watch. And the great thing is that our company has had such a marriage for a couple of years now, and it works. Anyway, I'm dusting off my Ruby books, and diving in. Fun times!


Plotting my return to Twitter

On 28 Nov, 2010 By mpm With 2 Comments

In April of this year, I left Twitter. I had good reason to leave. And after a few months, I didn't miss it. Frankly, I still don't miss it. But I had a bit of an epiphany lately that you social media mavens out there will very much appreciate, so I figured it was worth writing about on this blog.

I joined Twitter in the beginning because my colleagues were joining. I didn't have a reason, or a goal, except to find out what everyone else was, well, all a-twitter about (sorry, I couldn't help it). I knew that my nonprofit consulting practice was not going to be geared toward social media (as you all know, I veer way more toward the plumber end of the web technology spectrum). And it was fun, for a while; then it got old. I didn't have a specific set of things I wanted to get out there in the world (save in the realm of what I can easily do by blogging), and I just joined because all of my nptech buddies joined. I got overwhelmed by the information coming my way, and it invaded my life. So I left.

What's changed for me is that I now have a goal and a focus, and with that goal and focus comes a realization. Aha! Twitter will be useful. It sort of took me by surprise, interestingly enough. I began to think about how I would approach this thing, and what would be the best way to learn more, as well as share and put stuff out, and... voila, Twitter. And the lesson I learned, which I'm sure lots of nonprofits are learning, seems to be: Twitter is a means to an end, and it's important for me to treat it that way, rather than as an end unto itself. I know the social media folks have been saying this all along, but it took me this long for it to really sink in.

I know that at least some of you are thinking, "So what's the goal and focus?" Sorry, it's not nonprofit technology, y'all. Now that I'll be back on Twitter, I'll probably do a few tweets now and again from our company Twitter account, so feel free to follow. And please don't feel at all slighted if I stop following you on my personal Twitter account (it's likely). Because besides being a web techie, I'm a science fiction writer with some stories and novels to peddle.

Continue Reading

A couple of tidbits

On 15 Nov, 2010 By mpm

A couple of tidbits on the environment today:

Continue Reading

Leaving Apple Behind

On 11 Oct, 2010 By mpm With 5 Comments

I've been through a pretty interesting transformation in the last two months. I've gone from being a Mac/iPhone user to being a ThinkPad/Android user, and I'm not looking back. I'm actually quite happy: I can run both Windows 7 and Ubuntu Linux on my laptop, and I like Android (and my Droid 2 phone) a lot. Once I sell my iPhone (fairly soon), I will be free of Apple hardware for the first time in 25 years (yikes! Er, well, actually, I think there were a couple of years there when I had only a Power Computing Mac clone).

I made the change for two reasons. First, I really needed Windows for work. There are some tools our team uses that have no Macintosh version or alternative, and collaboration with our team using MS Office tools has become so much smoother. I'm surprised at how Mac-like and trouble-free Windows 7 is (I haven't had Windows on a machine I owned since Windows 2000). It's basically unobtrusive as an OS (well, besides the annoying pop-ups asking for permission during installation, but that seems a worthy trade-off against the virus- and worm-laden alternative). The funny thing about using Windows is that I've now noticed how much work I used to have to do to make sure things were going to work with my Mac. That's not a problem I face anymore. And, of course, using Ubuntu on the desktop is fun: a great web development environment. Some things (adding peripherals, especially) are still a little problematic, but it's getting better.

The second reason was more philosophical. The release of the iPad (which I will never buy; I'm waiting until there is a credible Android tablet) sort of woke me up. I had come to chafe at the closed-ness of the whole system, but somehow the ease of use in comparison to the alternatives was addictive for a long time. The iPad, a simply consumptive device, and a closed one at that, made me realize that coolness of design wasn't worth the trade-off.

And now that I've spent some time with the alternatives ... well, yes, Apple products still have a bit more gloss, but the usability, for me, is not hugely better than any of the three alternatives I've been working with (which, in the case of both Windows and Linux, was not always true). I actually like my Droid phone better than I ever liked my iPhone, which came as somewhat of a surprise to me. I've always really liked ThinkPads, and have owned a couple over the years to run Linux on. It does feel a bit weird to use Windows sometimes, considering my years as an open source advocate. But I don't think Apple is any purer, really; they just have less market share. (Although they did make the right choice in basing their OS on Unix. I wish MS had done that, although they might have frakked it up.) What's true is that I've given up one corporate behemoth to fully embrace two others: Google and Microsoft. A scary amount of my data (mail, contacts, photos, task lists, calendars, phone, etc.) resides in Google's data centers. I use their software every day (Android, Chrome, Picasa, Google Earth). I'm not quite sure how I feel about that.

What will I miss? I don't miss my iPhone at all (especially now that Angry Birds is on Android). I'll miss GarageBand. I actually think that's it; at least, after using a PC exclusively for the last month, that's all I can think of.

Continue Reading

Salesforce as a CMS?

On 22 Sep, 2010 By mpm With 4 Comments

Salesforce is a very powerful platform onto which one can build a large variety of interesting custom applications. I've already talked on this blog about Salesforce integration with Drupal, Plone, and others. Today I'm going to delve into Salesforce-based CMS systems: systems built as applications on top of the Force.com platform.

First, what are the advantages and disadvantages of this approach? The primary disadvantage is that Salesforce was not designed as a CMS; it was designed as a sales force automation and customer service tool. It has become a powerful platform, and there is a lot you can do with it, but it was never designed with content or visual design in mind. What are the advantages? If you're running database applications (tracking donations, events, programs, clients) and want deep integration between your web content and your data, it is an approach that is hard to beat. Certainly CMS/CRM integrations can go a long way, but ultimately, using Salesforce as your CMS platform will provide a kind of power that is not easily replicable using an integration. With that power, though, may come some sacrifices. What are the options for doing CMS-like things on the Force.com platform?

  • The native capability of something called "Sites," which is a publicly facing version of what's called Visualforce: a markup language that combines HTML with Apex code (the Force.com programming language). This requires a lot of custom code, and becomes unwieldy once you get past more than a few pages, unless you write a mini-CMS yourself to handle things as a site gets more complex. But there is a lot there.
  • CMSForce. This is an "open source" CMS (in quotes because, although you can get the source code, do what you'd like with it, and contribute to the project, it's written on a proprietary platform, so it's not really open source). I've spent quite a bit of time with this one, and more to come, I'm sure. Like any open source project, there is a lot that could be done to make it more usable, but it certainly is something to evaluate, and contribute to, if you find it useful. It is written by Force.com Labs, so it's got serious Force.com developers behind it.
  • OrchestraCMS. This is a paid app, with discounts for nonprofit organizations. I've only taken a test drive, but it's pretty impressive: it has its own UI, and is well developed. There were a few hiccups in getting going, but I suspect that was because I only spent a little time with it. A partner we work with has done a lot of work with this application, and we're pretty interested in it.

There are a couple of others, and I'm sure more in development. Salesforce has a rich enough data model and development platform to sustain a solid CMS. The big question is: is this the right fit in terms of integration? Salesforce-based content management is embryonic in comparison to CMS systems such as Drupal or Plone (or even WordPress, for that matter), but being able to draw data directly into and out of Salesforce very easily might well be worth it for some organizations running Salesforce. It's also possible to keep one's main site in a solid CMS and, instead of using complex integrations, have a mini-site with the same look and feel based in Salesforce for the data needs you have. Again, it depends on what your use cases are, but that's another way to go.

Continue Reading

Does Social Media Work?

On 13 Sep, 2010 By mpm With 1 Comments

I know for many of you this is old news, but I'm not on Twitter anymore, and I don't read my RSS feeds as often as I should. In July, Idealware published the Nonprofit Social Media Decision Guide. It's great: chock full of good information, and some very, very interesting research. One of the most interesting tidbits of data, to me, was the large gap between people who "thought" social media of various types either helped them reach new audiences or helped them raise money, and those who really "knew" this was the case. Further, the changes people actually reported were modest: increased website traffic (20%), substantive feedback and discussions (21%), and attracting new members or volunteers (16%). There are some great worksheets to help you figure out what strategies to use, and how to move forward in this space. And there is, to my mind, a lot of fodder for thought and conversation among folks thinking about how to really measure success in social media, as well as those of us thinking about social CRM: how best to capture that data, whether it be engagement metrics or actual constituent information.

Continue Reading

Women Who Tech Telesummit

On 31 Aug, 2010 By mpm

I've been involved in this Telesummit now since the beginning. It's really fun, and important.

The [Women Who Tech Telesummit](http://www.womenwhotech.com) was formed three years ago to celebrate all the innovative women who provide incredible value to technology and social media. So it's time to [come get your tech on](http://www.bit.ly/womenwhotech)!

Come join hundreds of women on September 15th at the Women Who Tech Telesummit from 11AM to 6PM Eastern Time. (It's virtual - all you need is access to a phone line and the web so you can participate from anywhere in the world).

Women Who Tech's thought-provoking virtual panels offer the latest resources and tools for launching a successful startup, tools and apps to build your online community, social media ROI, and more.

Among the sessions:

  • Launching Your Own Startup
  • Creating a Culture of Collaboration and Innovation
  • Female Ferocity
  • ROI of Social Networking
  • Speak Up: Pitching and Public Speaking Mojo
  • Building the Ultimate User Experience
  • Women and Open Source and Identity

And more...

Panelists include a "who's who" of women on the forefront of social change and technological progress, among them: Elisa Camahort Page, Co-Founder of BlogHer; Rashmi Sinha, Co-Founder of SlideShare; Beth Kanter, Blogger and CEO of Zoetica; Cheryl Contee of Fission Strategy; Shireen Mitchell of Digital Sistas; Genevieve Bell of Intel; Deanna Zandt, technologist and author; Liza Sabater of Culture Kitchen; Tara Hunt, author; Lynne Johnson of the Advertising Research Foundation; and Heather Harde, CEO of TechCrunch.

They are also [hosting after Parties](http://bit.ly/djFEi2) in DC, NYC and SF so come on out!

Continue Reading

When data gets political

On 27 Jul, 2010 By mpm With 13 Comments

Most days, data is pretty straightforward to us here at OpenIssue headquarters. Names, addresses, email addresses, the pesky notes field (today's bane of our existence). But sometimes, data is political. Or, I guess more accurately, data models are.

In most CRM systems, especially older and less flexible ones, some fields can be points of contention for some of us. Gender is one; marital status is another. CiviCRM, to its credit, allows for an arbitrary number of genders: you can define them however you like. My bet (although I could be wrong) is that it's one of the few systems out there that allows that. Gender is not a standard field in Salesforce.com contact records, so if you want to add your own, you can customize it however you'd like. There was a very interesting and lively discussion about the gender field in Drupal profiles; of course, one can always customize these things in Drupal. For a couple of projects we've been working on, we've been getting very interested in putting together a really expanded and fleshed-out data model for gender, sexual orientation, and marital status. Here's the first draft. We'd love feedback on this (besides "this is silly/too radical/dangerous/from the antichrist/etc."). And we also know that even among those who agree that sex and gender are different things, people will differ on how to divide these categories and make sense of them.

  • Sex: Male, Female, FTM, MTF, Intersex
  • Gender: Male, Female, Genderqueer
  • Sexuality: Gay, Lesbian, Bisexual, Queer, Questioning, Straight
  • Marital Status: Straight Marriage, MA, DC, IA, VT Domestic, CA-SF 2004, CA 2008, Canada
  • Relationship Status: Single, Partnered, Divorced, Dating, Poly  (There probably could be some field dependencies of Marital Status on Relationship Status)

And if you maybe thought that OpenIssue headquarters was in San Francisco, I'm sure this list settled it. (Yes, we are.)

Continue Reading

Three months without Twitter

On 19 Jul, 2010 By mpm With 3 Comments

As you know, I left Twitter three months ago today. I figured it was a good time to reflect on my experience over this time: what I miss, and what I don't.

What I don't miss

  • Distractions: I find myself more productive, for sure. I was never very disciplined about turning Twitter off, so I was constantly distracted. The lack of distraction has been a really good thing.
  • Information overload: how did I keep all that stuff in my head? My mind feels a lot quieter.
  • Need to share: I'm happy to leave behind the somewhat narcissistic impulse that Twitter feeds.

What I miss

  • Instant answers to questions! And answering people's questions.
  • Banter: Twitter is better for banter than any other medium besides being in person.
  • Opportunities for collaboration: it does seem like a lot of that happens now on Twitter and Facebook (which for me is a friends-and-family-only zone), so I'm probably missing out on some of it.

In general, I'm still happy I left, and have no plans to return. I have, on a couple of occasions, used search.twitter.com when a certain event was happening, so I could see up to the minute what was going on. I'm sure I'll still do that sometimes.

Continue Reading

Git

On 19 Jul, 2010 By mpm

I became sold on version control fairly far back in my programming life, back when CVS (Concurrent Versions System) was the standard. I learned it, although there were varied gaps in my use of it, so it never became second nature. As I learned more about newer version control systems, I tried them out. For a while, I was using SVN (Subversion), which is similar enough to CVS, but has some nice improvements. More and more folks are moving to distributed version control systems. I began to understand the great advantages of those systems, and decided to pick one to standardize on. Git stood out from the others in terms of popularity and resources. And, I figured anything Linus Torvalds wrote was good enough for me. That was last year. This year, drupal.org is moving to Git, making my life oh so much easier. In my daily life, Git has two major advantages: version control and comparison of versions even when I'm not connected to the internet (you have your own actual repository, not just a working copy), and its speed. It takes less time to clone a whole repository of code than it does to check out a working copy using CVS or SVN! It's really worth checking out. I imagine Git will become the new CVS: the new standard, until something better comes along to supplant it.
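The offline-repository point is easy to demonstrate. Here's a minimal sketch (the paths are throwaway examples; run it anywhere Git is installed): every clone carries the full history, so log and diff need no server at all.

```shell
# Create a throwaway repository and make two commits.
mkdir -p /tmp/git-demo && cd /tmp/git-demo
rm -rf repo
git init -q repo && cd repo
git config user.email demo@example.com   # identity just for this demo repo
git config user.name "Demo User"

echo "first draft" > notes.txt
git add notes.txt
git commit -qm "Initial commit"

echo "second draft" > notes.txt
git commit -qam "Revise notes"

# No network needed: the whole history lives in .git, so you can
# browse the log and diff any two versions while disconnected.
git log --oneline
git diff HEAD~1 HEAD -- notes.txt
```

With CVS or SVN, those last two commands would have required a round-trip to the central server; with Git they read straight from the local repository.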

Continue Reading

I\'m not changing the world

On 02 Jul, 2010 By mpm With 5 Comments

I've been working with nonprofit organizations on technology issues (strategy, implementation) for about 15 years now. I remember the heady days, when most nonprofits didn't even have networks, and some of them still didn't have internet access. In those days, most nonprofit techies were progressive, and we were sure that what we were doing was going to change the world for the better. Now, 15 years later, I'm pretty sure I'm not changing the world. You're still more likely to find a progressive nonprofit techie than a conservative one, but there are plenty of conservative ones now. Conservative causes of all sorts have discovered the power of the kinds of technologies I've been helping nonprofits with, and are au courant. Plenty of conservative organizations use Drupal, Salesforce, online fundraising, Facebook, and Twitter, using those technologies to push for ends that I am far from interested in seeing become reality. You can bet that the 2012 and 2016 presidential elections will not be a repeat of the 2008 election, with its massive differential in the use of technology and social networks.

I remember also, from those heady days, the idea that we could help nonprofits be more effective by encouraging them to be more proactive about replacing their hardware. Come to find out, not so much later, that the massive production (and disposal) of computer hardware fuels deadly conflicts and causes serious environmental damage. And then there is the fundamental question: what is all this technology really for, anyway? I was reminded of this while listening to Marketplace on the radio a while ago. It's worth remembering that one of the two motive forces behind all of this technology change is that businesses (and nonprofits, too) can squeeze more work out of fewer people. That would be fine if we had a great safety net, where people who were unemployed could be supported, and perhaps get free education so they could create art, music, or new and interesting things, but that's not how the system works, is it? The second motive force is simply to empty your wallet so you can get shiny things.

I still think I'm doing good. I still think that working with nonprofits to help them grapple with communications and data is good work, helps people, and is right livelihood. But I'm pretty sure I'm not changing the world by doing it. I'm reminded, of course, of the famous Audre Lorde quote: "The master's tools will never dismantle the master's house." There may be other ways I'm helping to change the world, though, but you'll have to read my other blog for that.

Continue Reading


What Drupal and Salesforce have taught me about coding

On 30 Jun, 2010 By mpm With 1 Comments

I've been spending a fair bit of time in the last couple of years learning to code in a new way. It reminds me of the transition I made from writing stand-alone applications for varied computers to writing code for the web. When I was in college, grad school, and early in my academic career (this dates me: from the early 80s to the early 90s), I spent a lot of time writing stand-alone applications, mostly in Pascal and C. The shift from that kind of code to writing for the web was a lesson in protocols, constraints, and different ways of troubleshooting. The transition from writing free-form web applications to writing modules for Drupal, or Apex customizations for Salesforce, is another set of lessons in protocols and constraints. First, it's not enough to understand the syntax and form of the language (this is especially true for Apex; beware the required test coverage!). One has to understand how the surrounding application works: what APIs or methods one can use, and how. And unlike long-standing languages, there aren't lots of detailed cookbooks and that sort of thing lying around; a lot of it is learning from other folks, as well as learning by trial and error. And, from my small forays into frameworks like CakePHP, Ruby on Rails, and others, it seems that these days, coding for the web is many lessons in constraints, which is a good thing, I think. Even though it sometimes feels like beating my head against a wall, it's nice to know that I won't "dump core" and break Salesforce (although I have, for sure, broken Drupal on occasion!)

Continue Reading

Amazon S3 for web server backup

On 30 Jun, 2010 By mpm With 4 Comments

I've been getting to know Amazon S3 lately, and there are some great things about it. I think it is one of a long list of unpredicted successes that resulted from the near-ubiquity of open source software on the server side. We've been using it for "offsite" backup of Drupal sites for a while now, with a script that cron runs daily to do the backups. There are a number of ways to do this. We started by using s3fs to mount an S3 bucket in the filesystem, then just copying the files over. One of the scripts we've used is here. (We also use rsync.) However, s3fs isn't very actively supported or developed, so we're thinking of moving to s3cmd, which works really well and is still under active development.
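For the curious, a cron-able s3cmd backup script along these lines might look like the sketch below. The site path, database name, and bucket are hypothetical placeholders, and it assumes s3cmd has already been set up with your AWS credentials (via `s3cmd --configure`):

```shell
# Write out a nightly backup script. The docroot, database name, and
# bucket below are hypothetical examples -- substitute your own.
cat > /tmp/s3-backup.sh <<'EOF'
#!/bin/sh
set -e

SITE=/var/www/drupal              # Drupal docroot (example path)
BUCKET=s3://example-org-backups   # hypothetical bucket name
STAMP=$(date +%Y%m%d)

# Dump the Drupal database; MySQL credentials come from ~/.my.cnf
mysqldump drupal | gzip > /tmp/drupal-db-$STAMP.sql.gz

# Push the dump, then mirror the uploaded-files directory.
# s3cmd sync only transfers files that changed since the last run.
s3cmd put /tmp/drupal-db-$STAMP.sql.gz $BUCKET/db/
s3cmd sync $SITE/sites/default/files/ $BUCKET/files/

# Clean up the local dump
rm /tmp/drupal-db-$STAMP.sql.gz
EOF
chmod +x /tmp/s3-backup.sh
```

Moved somewhere permanent, a crontab line like `15 3 * * * /usr/local/bin/s3-backup.sh` would then run it nightly at 3:15am.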

Continue Reading

On 21 Jun, 2010 By mpm With 1 Comments

Again, a little peek at what I've been up to, reading, and thinking about. You can also see what I've been reading by looking at my shared items on my Google profile.

Continue Reading

Social CRM, part 2: Metrics vs. CRM

On 29 Apr, 2010 By mpm With 2 Comments

So while I've been off Twitter, I've had time to research social CRM (funny, that). And what I've found is pretty interesting. CRM stands for "Customer Relationship Management" (not to be confused with "Cause-Related Marketing"); it came from the for-profit space. In the nonprofit world, we generally use the acronym to mean "Constituent Relationship Management." From Wikipedia:

Customer relationship management is a broadly recognized, widely-implemented strategy for managing and nurturing a company's interactions with clients and sales prospects. It involves using technology to organize, automate, and synchronize business processes---principally sales activities, but also those for marketing, customer service, and technical support. The overall goals are to find, attract, and win new clients, nurture and retain those the company already has, entice former clients back into the fold, and reduce the costs of marketing and client service.

Now we could easily translate that into "managing and nurturing an organization's interactions with donors and constituents," with the "overall goals ... to find, attract, and win new donors, nurture and retain those donors the organization already has, entice former donors back into the fold, and reduce the costs of fundraising." (I've never been convinced that CRM and donation management are very different beasts, even though many argue otherwise.) Anyway, you all know this stuff, and know the tools we all use to do it: Salesforce, CiviCRM, Raiser's Edge, etc. And these tools are great at doing CRM with the standard communication methods: email, phone, snail mail, in-person contact. But what about social media as another form of communication? That was the question I came to this issue with. There are good arguments for why social media will radically change standard CRM practices; you should definitely read the report I mentioned in my earlier post. But in the social CRM space, there seems to be a lot more attention paid to what I would call "metrics," useful for attracting new donors and understanding the "emotional state of conversations," than to relationships that are trackable in order to "nurture and retain those donors the organization already has." I don't mean to downplay metrics; they are hugely important. But I think mixing up metrics and CRM might make it harder to do either well. For example, of the 18 use cases in Jeremiah Owyang's report on Social CRM, 7 or 8 are really use cases for metrics: "Social Campaign Tracking" and "Social Sales Insights," for instance. In this series, I'm going to talk a fair bit about both, although I'll lean more heavily on the CRM side of things than the metrics side, since that's more my bailiwick anyway. And I welcome any comments.

Continue Reading

External, alienated, busy-busy

On 19 Apr, 2010 By mpm With 11 Comments

As you might know, almost a year ago I made a big change in my use of social media: I segregated my social graph, moving work-related stuff to LinkedIn and Twitter, and keeping personal friends only on Facebook. Now I have taken the next step, and made a somewhat momentous decision. I'm not alone; Jon Stahl did this before me, and I know there are others. And there are plenty of people who never entered these waters at all. I have been fairly conflicted about this for a while. There are things I really like about Twitter, Identi.ca, Buzz, etc. I like being connected to the nptech community, and learning what's happening. I really like reaching out and getting questions answered. But being on those networks has taken its toll on me. It's time I need for other things. It's an influx of information into my brain that I really don't need. And I'm sure people really don't need to hear what I think, or what I'm doing, in 140 characters or less. Most of the reason I named this blog "Zen and the Art of Nonprofit Technology" is that I am very interested in the ethical and spiritual dimensions of technology in general, and nonprofit technology in particular. And I'm very interested in the way my work affects me and my life. Thomas Merton, one of the people I look to for wisdom, once said:

When I speak of the contemplative life ... I am talking about a special dimension of inner discipline and experience, a certain integrity and fullness of personal development, which are not compatible with a purely external, alienated, busy-busy existence.

(By \"alienated\" he meant alienated from ourselves.) For me (and only for me -[ I\'m not making any generalizations for others) this being almost always-on connected to the 140 characters-or-less social networks lead me to an external, alienated, busy-busy existence - the opposite of the direction I want to go. ]{style="font-style: normal;"} [So ... I deleted my Four Square account, and I disconnected varied things from my twitter account. I won\'t be using the 1/2 dozen or social media accounts that I have.  I won\'t be tweeting really anymore. I haven\'t deleted my twitter account, so if you DM me, I\'ll still get an SMS telling me. But I won\'t be watching it for the most part.]{style="font-style: normal;"} [I\'ll miss the banter, and the exchange. I\'ll miss the easy answers. I won\'t miss the barrage of info I don\'t need, or the time spent. And, I\'ll still be blogging. Although it likely won\'t be on too many up-to-the-minute news items (like the recent Ning Thing) because I\'ll be paying less attention to those goings on, and more attention to other, deeper things.]{style="font-style: normal;"}

Continue Reading

Betting the Farm

On 16 Apr, 2010 By mpm With 18 Comments

Countless nonprofits flocked to Ning to create social networks. Since I'm not a social media guru, I've generally kept my opinions about this to myself. But now that Ning isn't free anymore, I'm going to carp some. Over the course of lo these last few years, I have blogged or tweeted about this very phenomenon what feels like countless times. Nonprofits find services for free. They start depending on them. The free services disappear, for business reasons. The nonprofit community gets up in arms. Lather, rinse, repeat. There is nothing wrong with software or services that don't cost anything. Nothing at all. But if you are going to bet the farm, make sure you know what the risks are. Using free services is fine, but know why they are free. Are they free because the company behind them is an ad revenue machine and uber-profitable (Google)? Is it free because it's open source (Drupal, Elgg, WordPress)? Is it free because it's a profitable company with a clear and well-defined donation program (Salesforce.com)? Or is it free because it's a startup in search of a business model (Ning)? There is an effort afoot (and a petition) to get Ning to make nonprofit and educational accounts free. I'm not holding my breath. They eliminated 40% of their staff. They are feeling pinched, and need to slow their burn rate. I don't know how charitable this will make them feel. And even if they do, there is no guarantee that Ning will survive. Anyway, if you're looking for a great social network management system that won't get pulled out from under you, try Elgg. It's open source, and out of the box it does just about everything Ning does, without the deep setup required to make Drupal act like Ning. It has an active developer community, and is growing. Or, if you go looking for another free service, make sure you understand the risks, and be prepared for possible disaster if it's a startup in search of a business model.

Continue Reading

Social CRM, part 1

On 11 Apr, 2010 By mpm With 2 Comments

This blog series is all Beth Kanter's fault. We (the two partners of OpenIssue) shared a cab with her from the Atlanta airport to the hotel when we arrived for the 2010 Nonprofit Technology Conference. We were chatting about the kind of work we do, and she asked, "do you do social CRM?" She might not have seen the blank stares on our faces, since we were in a dark cab, but I'm sure she heard the pregnant, confused silence. As you know, I don't blog much about social media. I use it all the time, but there are much better sources of good information on that; I've been sticking to writing about what I know best. But I have to admit, this idea of social CRM piqued my interest. More than that: the truth is, if @kanter asks me about something related to social media, it must be important, so I'd better figure it out. And, of course, I'm at least a year behind the curve on this. There has been a lot going on in this space, although, frankly, in my research so far, I haven't found a lot in the technology sphere that would immediately be helpful to nonprofits (especially small to medium-sized ones). There's some, and I'll talk about that in the next posts in this series. Beth pointed us in the direction of Jeremiah Owyang, whom I'd been reading a little for a while, but had lost track of, since I don't follow the social media space carefully. He has a great post on the use cases for Social CRM. It's a really solid post, with an information-packed report attached, as well as some resources. It's a bit high-level for me; my job in life is generally to make use cases real using technology. I'm hoping that someone (hint, hint) will write the blog post or report that takes off from this work and articulates the major nonprofit use cases for Social CRM. The report does include some technologies to look at, and I'll be delving into those in future posts. I'm going to take a little chunk off of this, though, and ask some leading questions. Then I'll do my best, over the course of the next few weeks, to answer how these would get accomplished with the technological tools most nonprofits use or can get access to.

  1. How do you know which of your Facebook fans/Causes members are also donors (separate from donations through Causes)?
  2. How do you know how many of your Twitter followers are also donors?
  3. How do you know what percentage of your donors or constituents are on social media at all (Twitter, Facebook, MySpace, LinkedIn)?
  4. Can you follow the trail from a tweet (or Facebook status) to a donation? From a tweet to a specific action (like signing a petition)?

If you\'ve got more questions you\'d like to see me address, or you\'ve got some examples of how your nonprofit has answered these questions, please feel free to comment on this post.

Continue Reading

Off to NTC!

On 05 Apr, 2010 By mpm

Tomorrow morning, I'll be leaving on a jet plane for Atlanta, Georgia, and the 2010 Nonprofit Technology Conference. This will be my 7th NTC since 2001 (or, more accurately, my 5th; I went to two Circuit Rider Roundups). I'm looking forward to it. I'm speaking in two sessions: "Collaborative Problem Solving for Consultants," organized by Robert Weiner, and "Earth to Cloud," part of the fabulous Tech Track organized by Peter Campbell. I'm looking forward to the Unconference on Open Data organized by NetSquared, and to seeing lots of old colleagues. I'll probably be using Foursquare to check in to places (I'm still experimenting with that one).

Continue Reading


Why use contributed Salesforce modules for Drupal?

On 24 Mar, 2010 By mpm

Lobo's comment on my post yesterday prompted me to complete this blog entry, which I've been ruminating on for a while. I wrote an entry a while back on the state of Drupal/Salesforce integration. What I didn't say is that a number of shops that have done Drupal/Salesforce integration for production sites chose not to use the contributed modules; they built (or are building) their own custom integration modules. A few months ago, in preparation for a couple of projects and a big push into this area for our company, I was faced with a strategic choice: go it alone and build our own integration module for client projects, or plunge into using and working with the contributed Salesforce modules. Truth is, it wasn't really a choice for me; using and contributing back to open source projects is in my DNA somehow. Although we certainly could have chosen, like others, to go our own way, we have committed ourselves to using, and contributing to, the modules on drupal.org. What we lose:

  • Complete control over the development process and direction
  • Not having to fix other people's bugs in order for stuff to work

What we gain:

  • Not having to reinvent a number of wheels
  • An easier upgrade path
  • The chance to build on the work of others
  • Opportunities to collaborate and learn

The work done so far on the modules is really solid - and it's getting better. There is a great new maintainer, and increasing activity and contributions. There are also a number of other module integrations (like Ubercart, Webform, and FeedAPI) that are moving forward. Integrations with Views and Actions are also being considered (it's instructive to look at the issue queue). This is stuff that would be hard to match, and it makes building integrations for different kinds of sites easier. So beyond just my own personal preference, I think that there is much benefit, both for our clients and for us as a company, in hitching our wagon to these contributed modules instead of going it alone.


The easier it looks, the more expensive it will be (or, how to avoid clusterfrack projects)

On 23 Mar, 2010 By mpm With 3 Comments

As most of you know, I'm a very long time veteran of web application building. I've been involved in web application development basically since it started - when a cgi-bin folder with some Perl scripts to process simple forms was the norm. Until just a few years ago, there was very little sophistication about the user experience in web applications - what mattered most was functionality, and making sure there weren't too many errors when users did unexpected things. I've considered myself pretty successful both at helping clients navigate the tough waters of web development projects and at accomplishing web projects for them. Recently, though, I had two projects that ended up, for want of a better term, clusterfracks. And I've spent a lot of time lately trying to figure out what lessons I need to learn from those projects - what I can take away from them so I don't make the same mistakes again. They were both custom web applications, and both were projects that I was a strategic, rather than engineering, partner on. Both projects were attempting to accomplish pretty sophisticated database functionality (such as case management) - functionality I knew how to get done, because I'd accomplished it before, so I had a very good feeling for what kind of code it would take to accomplish the task (and, ergo, cost and time.) But what I hadn't taken into consideration is how slick, AJAXy, easy-to-navigate, and easy-to-understand user interfaces people have gotten used to in the last few years. And, frankly, have come to expect. And how unwilling people are to sacrifice that for raw functionality. I did a lot of self-examination: where did I go wrong? What could I have done differently? Was it the client? The developers? Me? I realized a fairly simple truth. It was all three.
In reality, I should have looked at the budgets of those projects, looked the clients straight in the eye, and said, "double or triple the budget at least, or don't do the project." And walked away if they insisted. The vendors should have bid triple what they did, and had more user interface expertise on board. The clients should not have expected to get slick 2009 functionality for a mid-five-figure budget. The easier a user interface is to use, the more money and time it took to create. It's that simple. What most nonprofit decision makers don't completely realize is that the interfaces they work in every day when they shop, or tweet and facebook, or use other online tools, are the product of millions and millions of dollars of venture capital, or, in some cases, hundreds of thousands of person-hours of work in open source projects (or some combination of both.) Ground-up custom applications, even when written in great frameworks like Ruby on Rails or CakePHP, which save all sorts of development time, just are not going to have the user experience people are getting more and more used to without very serious investment of time and expertise. In addition, most small development shops don't have the user interface expertise on hand to accomplish that task, even with a hefty budget. So the lessons:

  1. If you are embarking on a custom development project (such as case management, for example), exhaust every possible option of using and customizing/modifying existing tools (Salesforce, CiviCRM, SugarCRM, other open source tools) before you begin to consider custom development from scratch.
  2. If you have a budget of less than $100,000, go back, and stay, at step 1. I know this is high, but I'm serious. Obviously, simpler projects won't need a budget of this sort. But simpler projects generally don't need custom databases.
  3. If you've got the cash to spend, and have exhausted all other options, when choosing a vendor, make sure the vendor you choose has user experience expertise on hand. Look at other custom database work they've done. Dig in. Make sure it has the ease of user experience that you are expecting.
  4. Remember the mantra: the easier it is to use, the more expensive it is to build.

Drupal 7

On 16 Mar, 2010 By mpm

I've been doing a bit of playing around with Drupal 7 in my copious spare time (not a whole lot of that!) I've also been keeping track, a bit, of how the development process is going, and what things will look like. One thing to say - it feels like as big an improvement as Drupal 6 was over Drupal 5. Of course, mostly, Drupal is only as good as its contributed modules (that's a bit less true now, because many of the key contributed modules, like CCK, are now in core Drupal.) So when folks like us, who build sites that depend on contributed modules, can start using Drupal 7 is a bit up in the air, although there is a movement to get many modules ready for Drupal 7 at its release. But some may well not make it. We're guessing that we'll start building production sites in Drupal 7 starting in late summer or early fall, depending on requirements. A note: the standard process for deprecation of old Drupal versions is that when a new version of core comes out, the version two releases back stops being officially supported. So Drupal 5 will no longer get security updates and the like. Already, many module developers have stopped supporting versions of their modules that work on Drupal 5. (The Salesforce module maintainers recently made that decision, as have others.) A site running Drupal 5 certainly won't stop working, but it will become vulnerable without security updates to core or modules, and it will get increasingly difficult to maintain and add features to. So it might be a good idea to budget the time and money to upgrade as soon as possible if you are on Drupal 5. If you are on Drupal 6, you've got a while yet, but Drupal 7 certainly has some great advantages, particularly in user experience, to look at.


On 07 Mar, 2010 By mpm

The reason I post these is because 1) I think they might be helpful resources, and 2) you can get a feeling for what I'm working on, or thinking about (or wishing for.) For instance, the reason there are so many links about Amazon is that we are now beginning a project that uses Amazon in earnest, with some others possibly on the way.

Drupal Commerce

On 17 Feb, 2010 By mpm With 1 Comment

Although it's not often used in nonprofit settings, the Drupal module (or, more correctly, large suite of modules) called "Ubercart" is a pretty amazing tool if you need to create a shopping cart system. We've implemented it for organizations that want to charge fees for events, sell items, and take donations. It doesn't have many of the strengths of CiviCRM, but it has a lot of useful features if you want to sell things, or combine selling things with taking donations, memberships, and selling event tickets. A while back, I'd heard of the Ubercore initiative - a group of developers working to bring Ubercart to Drupal 7 (there was quite a delay between the release of Drupal 6 and the availability of Ubercart for Drupal 6.) That initiative is now called "Drupal Commerce." It is basically meant to be a rewrite of Ubercart for Drupal 7. It looks to be something to watch. Gregory Heller of CivicActions wrote an interesting conceptual piece on the integration of Drupal Commerce and CiviCRM that's worth a read. (By the way, there is a module done by DharmaTech that integrates CiviCRM and the current Ubercart.)


On 04 Feb, 2010 By mpm

Beth Kanter's Birthday

On 10 Jan, 2010 By mpm With 1 Comment

You don't hear me talking much about social media. One of the reasons is that there are a number of really good bloggers out there who know the field far better than I'll ever be able to. I would argue that Beth Kanter is the best social media blogger there is in the nonprofit space. Her careful writing about the strengths and weaknesses of social media, her in-depth knowledge of strategy and approach, and her on-the-cutting-edge understanding of trends and issues and how they relate to the work that nonprofits are doing have been an incredibly important resource to the sector. If someone asks me about social media, I just say "ask Beth!" And, it's her birthday! A big bunch of us are blogging today to talk about what she's contributed, and also to let people know about the charity she's been working with for a while now, the Sharing Foundation. She'd like folks to donate in honor of her birthday. Beth, thanks for the expertise and intelligence you've lent to the nonprofit sector for quite some time now! And thanks in advance for all the great upcoming posts in 2010 and beyond that will help me sound intelligent when I talk about social media. :-)

Drupal and Salesforce

On 31 Dec, 2009 By mpm With 10 Comments

It's taken me a while to write this blog post, mostly because I have been working hard at various things (like building a business and building new websites.) This is the last installment in my CRM/CMS integration series, which started almost a year ago (wow!) And I'm skipping Joomla/Salesforce integration because there isn't any publicly available documentation or code about the integration that PICnet did with Joomla and Salesforce, called J!Salesforce. [update: see Ryan's comment below.] So what is the state of Drupal/Salesforce integration? It's not as mature as the Plone/Salesforce integration, for sure, but it is coming along nicely. There are several contributed modules:

  • salesforce - the main module, with API, node, and user integration possibilities. This module provides the basic Salesforce API connection (via SOAP), and includes field mapping and basic import/export.
  • sf_webform - Makes integration with webforms in Drupal fairly easy. Web-to-lead is quite nice and flexible with this module.
  • uc_salesforce - Provides integration with Ubercart orders.
  • parser-salesforce - Integration with FeedAPI - pulling data from Salesforce into Drupal nodes via FeedAPI (I hope to start maintaining this module).
  • sf_import - Imports Salesforce objects into Drupal nodes (will be folded into the main salesforce module).

All of these modules are in alpha or beta, although I know for a fact that some of them (or versions of them) are working in production sites. There are a fair number of bugs that need to be fixed before there is a stable release, and a bunch of outstanding issues that need a lot of work (like caching, for instance). There are two other modules that are related, but don't use the main salesforce API module - one for Ubercart, and one for web-to-lead (called salesforcewebform). That module has a stable release, but only provides the ability to integrate between webforms and leads, not other objects. Right now, the salesforce module allows for integration of contact, lead, and campaign objects only, so that's another big area that could use some work. There is a good screencast done by one of the folks (Jeff Miccolis from Development Seed) who has worked a lot on this project. I'd say that in a year, we'll have a good solid module release, providing lots of features for integration between Drupal and Salesforce.com.
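The web-to-lead flow that modules like sf_webform automate boils down to posting form fields to Salesforce's Web-to-Lead endpoint. Here's a minimal sketch in Python of what that mapping looks like; the org ID and field names are illustrative placeholders, not values from any real Salesforce org:

```python
# Sketch of the web-to-lead flow: a form submission becomes an HTTP POST
# to Salesforce's Web-to-Lead endpoint. The org ID and form field names
# below are illustrative placeholders.
from urllib.parse import urlencode

WEB_TO_LEAD_URL = "https://www.salesforce.com/servlet/servlet.WebToLead"

def build_lead_payload(org_id, form_values):
    """Map submitted form values onto Web-to-Lead POST fields."""
    payload = {
        "oid": org_id,  # Salesforce organization ID identifies your org
        "first_name": form_values.get("first_name", ""),
        "last_name": form_values.get("last_name", ""),
        "email": form_values.get("email", ""),
        "lead_source": "Website form",
    }
    return urlencode(payload)

# A real module would POST this body with the site's HTTP client.
body = build_lead_payload(
    "00D000000000XYZ",
    {"first_name": "Ada", "last_name": "Lovelace", "email": "ada@example.org"},
)
```

The point of a module like sf_webform is exactly this mapping step: taking whatever fields a site builder defines on a webform and translating them into the lead fields Salesforce expects.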

Got Research?

On 07 Dec, 2009 By mpm

One of the great things about the nonprofit technology field is the collection of nonprofit organizations that provide what are often called "intermediary" services to other nonprofits: information and resources that help nonprofit organizations do the work they do in the world, by helping them make good technology decisions. I've been involved in one way or another with a number of these intermediary organizations. One of them, Idealware, is an organization whose goal is to provide the sector with unbiased, analytically developed reviews and information about software that nonprofits use in their everyday work. This is incredibly important stuff, and it's darned hard work - I know, because I've been involved in doing a bit of research for Idealware. If we don't have this sort of research in our sector, nonprofits won't have an analytical approach to software available to them - and it is much needed. As you might imagine, funding this sort of work doesn't come easily - they need our help to be able to continue to provide great research.

More symptoms of bigger problems

On 10 Nov, 2009 By mpm With 1 Comment

[Three screenshots: NovSummits1, NovSummits2, NovSummits3] 'nuff said.

Same crap, different day

On 09 Nov, 2009 By mpm With 3 Comments

I'm warning you - this is snarky. I was only vaguely following the brouhaha over Causes leaving MySpace. Only vaguely because I don't really keep close track of the goings-on in the social networking space: it's not my passion. I use these tools a lot, both for work and for personal use. I know they are becoming an increasingly important tool for nonprofits in communicating with their constituents, and so I do keep them in my peripheral vision, for sure. Anyway, in reading the varied reactions to this news, I had to just sigh, and then get annoyed. Sigh because of what feels to me to be the wasted energy that the nonprofit sector has spent over many years using, hawking, and supporting proprietary tools and companies. Annoyed because it seems the nptech community hasn't figured this out, even after being hit over the head with it over, and over, and over again. Make no mistake about it - Causes is a for-profit company, and they are making what is, I'd bet, a decision based entirely on economics. If you've read any of the gloomy news from Silicon Valley, this is just the beginning. Social ventures will not be immune to the blowing winds of economic distress. If we keep building our nonprofit toolsets on proprietary software and for-profit web services, even if they are free (for now), we are going to be bitten by this over and over again. The only way we're going to get out of this cycle is to channel this energy and these resources into open software (including "open" source apps for proprietary web services), open standards, and open networks - things no one can take away. I love to write blog entries about successful open source efforts - like CiviCRM, or the amazing stuff people are doing in the mobile space. Writing blog entries about for-profit web vendors that make economic decisions that hurt nonprofits because we depend on them too much is just not fun.

Open Mobile Camp report

On 25 Oct, 2009 By mpm With 3 Comments

Yesterday, I spent the day in Manhattan, at the UNICEF building, with a bunch of folks passionate about the technology in mobile phones, and the ways to use that technology for good. I've been a very long time cell phone user (I've had one since 1998), but I haven't been involved in implementing a mobile system for an organization, so I had a lot to learn. The place to find reports on what happened is on the wiki. Also, check out the twitter stream for the #omc09 hashtag. I was especially interested in the issue of mobile data collection. (I was so interested, I facilitated a session.) And, even more specifically, I'm interested in how to leverage CiviCRM and mobile devices for a range of interesting applications. There are a number of ways to get data from mobile phones into a CRM - and all have advantages and disadvantages, depending on a lot of things:

  • Globally, what you can basically depend on is SMS. Smartphones haven't made it into most of the developing world, nor have 3G networks. So how do you get SMS data into a database system like CiviCRM? You need an SMS gateway, and systems such as RapidSMS to gather data.
  • Use J2ME to write applications for mobile phones, and send the data via SMS to a central database.
  • Use a tool such as EpiCollect, which is an Android app.
  • Use a slimmed-down, simplified webform on mobile browsers.

One thing that would facilitate this would be a more robust API system in CiviCRM - access to the data via REST or JSON, which would allow CiviCRM to talk with some of the tools out there like Mesh4X. I learned a ton. Thanks to MobileActive.org and the Open Mobile Consortium for a fabulous event.
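The SMS-gateway path above has a common shape: the gateway hands you a raw text message, you parse it into structured fields, and you push the record to the CRM over an HTTP API. A toy sketch of the parsing step, in Python; the keyword-style message format is hypothetical, purely for illustration:

```python
# Sketch of the SMS-to-CRM path: an SMS gateway delivers raw text, which
# gets parsed into fields before being pushed to the CRM over a REST-style
# API. The message format here is hypothetical.
def parse_report_sms(text):
    """Parse a keyword-style SMS like 'REPORT site=12 cases=3' into a dict."""
    parts = text.strip().split()
    keyword, pairs = parts[0].upper(), parts[1:]
    record = {"type": keyword}
    for pair in pairs:
        key, _, value = pair.partition("=")
        record[key] = value
    return record

record = parse_report_sms("REPORT site=12 cases=3")
# A gateway handler would then POST `record` as JSON to the CRM's API -
# which is exactly why a more robust REST/JSON API in CiviCRM would help.
```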

On 14 Oct, 2009 By mpm

These are the last 10 sites I bookmarked on delicious.com:

Security and Privacy in a Web 2.0 world

On 08 Oct, 2009 By mpm With 2 Comments

[Image: Security Camera - Photo by Sirius Rust]

Beth threw down the gauntlet, and I had to pick it up. I'm sort of surprised I hadn't written about this before. I think a lot about both of these, not so much for myself, but for organizations I work with whose work is fairly sensitive. First off, some definitions - I think that these two terms do get mixed up quite often, and understanding what's really meant by them in a technical context is important. Security, in this context, is the concept that your personal computing resources and data are safe from prying eyes, as well as from hijack by crackers and spammers who would use those resources and data for their nefarious ends. In the case of your computing resources and personal data inside that box you call your laptop, or protecting the whole of your home or office network, security is a matter of using specific tools that prevent unprivileged outsiders from getting in. Wifi passwords, firewalls, password-protected fileshares, virus protection software, etc. are the tools of the trade here. Security of your private data that is "in the cloud" is largely at the mercy of the software developers who hold your data. Luckily, most of them take security quite seriously. (That said, your data "in the cloud" can be compromised by lack of security on your network or laptop - someone installs a keylogger, for instance, and grabs all of your passwords.) Privacy, in this context, is the ability to control, in a granular sense, what information about you is exposed to whom. Privacy is, as Beth says, primarily a matter of human behavior, but there are very interesting intersections with technology and security. In some instances, services have default privacy settings that are a lot less private than someone might like - and it takes some know-how to figure out how to correct those settings.
Privacy is, also, a set of decisions that get made - sometimes in haste, or without much consideration. Once you make the drunken decision to post that picture of you (or a co-worker) dancing in your underwear on a table at a party, the cat is out of the bag, and may never be able to be put back. Security and privacy in the context of online communities, as Beth points out, are different beasts. The software packages that drive online communities (such as Drupal, phpBB, and others) have options to allow for varied levels of security. You might need a password to see anything. Or you might just need a password to make comments. You might not be able to simply register for an account - you might need to go through an admin. These days, most software driving communities has roles you can assign people to, with specific privileges granted per role. But privacy is made up of policy (the policy of the organization running the community) as well as the behavior of the members - their collective agreement that "what happens in Vegas, stays in Vegas."
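The role-based model described above can be pictured with a tiny sketch: each role carries a set of privileges, and a member's effective permissions are the union of their roles'. The role and permission names here are made up for illustration, not drawn from any particular CMS:

```python
# Minimal sketch of role-based access control as used by community
# software: roles map to privilege sets, and a check succeeds if any
# of the user's roles grants the permission. Names are illustrative.
ROLE_PERMISSIONS = {
    "anonymous": {"view public content"},
    "member": {"view public content", "post comments"},
    "moderator": {"view public content", "post comments", "delete comments"},
}

def can(user_roles, permission):
    """True if any of the user's roles grants the permission."""
    return any(permission in ROLE_PERMISSIONS.get(role, set())
               for role in user_roles)

allowed = can(["member"], "post comments")        # a member may comment
blocked = can(["anonymous"], "delete comments")   # an anonymous user may not
```

Note that this only models the security side; the privacy side - what members choose to reveal, and what the community's policy allows - can't be enforced by a permission table.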

Data Ecosystems

On 30 Sep, 2009 By mpm With 2 Comments

Not so long ago, nonprofit organizations had software tools that dealt with specific parts of their organizational process. They had fundraising tools, client management tools, volunteer management tools, HR tools, accounting tools, etc. And the data in these varied tools was siloed - there was no way for one tool to talk to another without:

  1. painstaking manual entry
  2. painstaking export/import processes
  3. tools written by the same vendor designed to talk to each other (which meant that they were generally exceedingly expensive)

Although many nonprofit organizations still find themselves in this situation, there are increasing numbers of tools available to help them out of it. And as more and more organizational processes become web-based (whether "in the cloud" or self-hosted), and as more and more nonprofit-focused software includes open APIs (with some unfortunate exceptions), nonprofit data is looking less and less siloed, and more and more like an ecosystem - many different software parts talking to others. NTEN is trying to get a handle on this with the Data Ecosystem Survey. I'm very much looking forward to the results - looking to see where this new set of tools that can talk freely to each other is working ... and where it isn't - where there is still work to be done. Please take the time to fill it out!


Evaluation and being a learning organization

On 12 Sep, 2009 By mpm With 3 Comments

Beth Kanter tweeted about an article by Gale Berkowitz relating to evaluation, which I found really fascinating - it is a must read. In this article, Gale points to an interesting shift within her organization (the Packard Foundation):

\"Over the past four years we have been shifting from evaluation for proof or accountability ("Did the program work?") to evaluation for program improvement ("What did we learn that can help us make the program better?").\"

In some ways, it's a subtle shift - but as she says, the latter leads to "real time" evaluation - something that happens as one moves through projects, not just at the end. Nonprofit organizations often have their feet held to the fire to evaluate their programs and projects, because funders and contributors often demand proof that their programs work. And there has been an overall movement in the sector in the direction of increased evaluation and learning. In the community I'm a part of - the group of for-profit ("for-little-profit," as is often said - we're small and lean) companies that serve the technical needs of nonprofits - evaluation is generally not part of the process of the work we do. But it should be. I've talked about this before. A lot. In a variety of different contexts. To me, evaluation, both internal ("How could we have done this process better?" "How could we have worked together as a team better?") as well as external, with the client ("How did we do?" "What could we have done better?" "How could we have communicated better?"), is a critical part of the work we do. It's a tough balance. We're geeks, often deep in the command line, SQL, and code. We're often extremely busy, juggling lots of projects and demands at once. The bottom line, of course, is always a measure of how well we are doing, but I don't think that's enough. As our sector as a whole moves further and further along the path of a commitment to evaluation and learning, I think it behooves us to follow. So, you ask, what are good strategies to start with? I can give you what we try to do. Some of it is well worked out, and some is nascent. All of these we aim to do, but it's easy to miss the target. Evaluation is a learning process, like anything else, and the most important thing is an intention and commitment to being a learning organization. The rest will eventually follow over time.

  1. Spend time at the beginning of each project outlining evaluation steps and process for the project.
  2. Spend time at the end of every project asking internally \"what worked, and what didn\'t work?\"
  3. Ask clients at the end of the project a set of questions about the process and the result.
  4. If it's an ongoing engagement, ask periodically (we aim for every 6 months or so) for an evaluation meeting or call with the client.
  5. Write a report at the end of each project with lessons learned.
  6. When a proposal isn\'t accepted, ask a few questions, both internally and externally, and write up a short report with lessons learned.
  7. Ask internally how earlier lessons learned are being applied to current projects.
  8. Always be open to learning how to make things better.

On 11 Sep, 2009 By mpm

Some great tech and nonprofit tech stuff I've come across lately:

Specify Story, not details

On 19 Aug, 2009 By mpm With 4 Comments

I've been a fan of user stories for several years now. User stories are a way to describe a set of functionalities of an application in a way that is focused on results - which makes them easy to connect to mission. An example from an events management application:

The organization should be able to create several different kinds of events, and determine for each kind of event which detailed information will be taken. Those events can be displayed in a list or calendar format. Users can register for events, and pay using a credit card.

There are many ways to describe this story - it certainly can be a lot more detailed, but what's clear is the result of this functionality. And, of course, user stories are great for an agile development process. Developers would determine how much this function would cost (based on our knowledge of the tools we use, and the time it takes using those tools to generate this sort of functionality), and clients would know exactly what they are getting from a functionality standpoint. When this functionality is complete, everyone is happy. The developers get reasonable compensation for a job well done, and the clients get the mission-based functionality they asked for. And it would avoid a situation with which I have recently become far too familiar - vendors who underbid projects, and then feel the need to resort not to the intent of the contract, but to the letter. Everyone knows it is utterly impossible to specify every detail in the letter of a contract - and sometimes the letter of the contract, unfortunately, details things like fields and queries, not functionality. The letter of a contract will be, almost by definition, unless based on functionality, an inadequate representation of the final result needed. In this case, no one really wins. The clients either don't get the functionality they expected, or they pay extra for it, and they leave the project with a bad taste in their mouth about the vendors, which will only come around to hurt the vendors later.
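The events story quoted above translates naturally into result-level acceptance checks rather than field-level specs. A toy sketch, assuming nothing about any real system - all class and field names here are invented for illustration:

```python
# Toy sketch of the events user story: event kinds define which details
# they collect, events of a kind must supply those details, and users can
# register. Names are invented, not from any real events application.
class EventKind:
    def __init__(self, name, detail_fields):
        self.name, self.detail_fields = name, detail_fields

class Event:
    def __init__(self, kind, title, details):
        missing = [f for f in kind.detail_fields if f not in details]
        if missing:
            raise ValueError(f"missing details: {missing}")
        self.kind, self.title, self.details = kind, title, details
        self.registrations = []

    def register(self, user_email):
        """Register a user for this event (payment handling omitted)."""
        self.registrations.append(user_email)

workshop = EventKind("workshop", ["location", "capacity"])
ev = Event(workshop, "Intro to CiviCRM",
           {"location": "Boston", "capacity": "30"})
ev.register("ada@example.org")
```

The story says nothing about these class names or fields - and that's the point: the contract specifies the result (kinds of events, configurable details, registration), and the implementation details stay the developer's call.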

Diversity and Open Source

On 01 Aug, 2009 By mpm With 11 Comments

The Python community has started a conversation about diversity, with the ultimate goal of creating, basically, a welcoming statement. It comes out of Kirrily Robert's keynote at OSCON about women and open source. There is a cool site from the Ruby community called Railsbridge, and one of their guidelines is to "Reach out to individuals and groups who are underrepresented in the community." There has been, of course, a lot said about the fact that although women make up 20% of the tech field, they make up only approximately 1.5% of open source communities. There have been long-standing groups that have tried to address this, and new efforts as well. Some open source communities are more diverse than others. In her keynote, Kirrily talks about two open source projects, Archive of Our Own and Dreamwidth, that have a majority of women involved, which is rather unusual. A short twitter conversation I had with a colleague brought up the issue of whether or not this is just an exercise - will this actually lead to any lasting change? That's a good question. Kirrily has a set of really good guidelines for open source communities:

  • Recruit diversity
  • Say it, mean it
  • Tools (Tools are easy)
  • Transparency
  • Don\'t Stare
  • Value all contributions
  • Call people on their crap
  • Pay Attention

As a long time open source user and advocate, even though I am someone who rarely finds people like me in open source projects (i.e., other women of color), I've always seen the open source movement as a potential avenue for the greater involvement of people other than white, straight, young men, because theoretically (this is the important part) one's involvement in a community is a pure meritocracy. But so many open source communities have a long way to go when it comes to being welcoming. I'm reminded of sitting in Drupalcon in DC and hearing Dries talk about the "beard length" of the developers. And of course there was the huge brouhaha around a presentation at a recent Ruby conference. And, of course, there are other factors as well. There are far too few places like The Community Software Lab of Lowell, MA, whose mission is:

We write, administer and maintain open source software to serve the underserved. We use and improve the skills of people with underused skills. We work to make hacker sub culture values (transparency, meritocracy and generosity) the values of the entire culture and bring about the post scarcity society. We work toward our mission by trying to achieve our short term goals transparently and generously while accumulating only necessary wealth.

So what will it take? Will this effort in the Python community pan out? I think it's a great start. I think the first step is definitely a focus on community environment. Is it friendly? Is it welcoming? Is it easy for new developers to start, and get deeper in? Are there good mentoring models? All of that makes a huge difference. And having a statement doesn't at all guarantee anything, but it provides something people can point to and say "this is our goal." Better than nothing, and a lot better than many open source communities are doing.

The wonders of libcloud

On 30 Jul, 2009 By mpm With 2 Comments

Here at OpenIssue, we think a lot about the web. I mean, a LOT. And we've been thinking a lot about web hosting, and the varied flavors it comes in. We're working to figure out what makes sense for us to use and implement, and what makes sense for us to recommend to our clients. A while ago, we decided, like many folks, that virtual private servers were going to be our preferred hosting setup. Not that it's right for all organizations - but for many who invest significant dollars into implementation of a website or CiviCRM, the advantages of a VPS will likely outweigh the higher monthly cost. We started using Slicehost, which was incredibly easy to set up and use, and which was acquired by Rackspace, widely considered the premium dedicated server hosting company. I then soon discovered a service called Cloudkick, which allowed us to monitor all of our slices and our clients' slices in one dashboard. That was very cool. It turns out that in the process of creating Cloudkick, the folks there came up with libcloud - a library that gives developers one common interface to the operations cloud providers offer - list, restart, create, destroy, etc. There are now a number of cloud hosting providers, such as Rackspace Cloud Servers (which used to be Mosso), Slicehost, and Amazon, that are beginning to be supported by libcloud. Libcloud has become its own open source project, and is under active development. Hopefully, this will provide a plethora of options for folks in terms of being able to monitor and manage the varied cloud servers they've got going. It certainly has already made our lives a lot easier.
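The appeal of libcloud is that management code is written once against a common interface, with a driver per provider behind it. Here's a toy sketch of that idea; a stub driver stands in for a real provider (real libcloud drivers take API credentials and make network calls, and the names here are illustrative, not libcloud's actual API):

```python
# Toy sketch of the idea behind libcloud: every provider driver exposes
# the same operations, so monitoring/management code is provider-agnostic.
# StubDriver stands in for a real provider; names are illustrative.
class Node:
    def __init__(self, name, state):
        self.name, self.state = name, state

class StubDriver:
    """Pretend provider driver implementing the common operations."""
    def __init__(self):
        self._nodes = [Node("web1", "running"), Node("db1", "stopped")]

    def list_nodes(self):
        return list(self._nodes)

    def reboot_node(self, node):
        node.state = "running"
        return True

def restart_stopped(driver):
    """Management code written once; works with any conforming driver."""
    for node in driver.list_nodes():
        if node.state == "stopped":
            driver.reboot_node(node)
    return [n.state for n in driver.list_nodes()]

states = restart_stopped(StubDriver())
```

Swap `StubDriver` for a Rackspace or Amazon driver and `restart_stopped` doesn't change - that's the whole pitch of the library.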

Continue Reading

Tidbits

On 03 Jul, 2009 By mpm

Here's a broad-ranging list of interesting tidbits I've found recently.

Continue Reading


Newly discovered project management tool: Redmine

On 02 Jul, 2009 By mpm With 7 Comments

Any consulting shop that does significant amounts of implementation and development (as we do) needs a project management and ticketing tool. Basecamp seems to be the standard that many people reach for. We were using Intervals for a while, which is really a fabulous tool if you do a lot of hourly consulting. We have also been using Google spreadsheets for some elements of project management. All tools have their strengths and weaknesses, and, in addition, the best tool does nothing without good human project management skills behind it. As a shop that practices Agile development (we use an adaptation of scrum methodology that seems to work for a shop doing multiple projects with small teams), finding a good tool that facilitates instead of hobbles Agile was critical for us. We found, and have chosen to use, Redmine for our project management/ticketing system. You can think of it as a multi-project version of Trac, the fabulous ticketing/wiki system that we were initially going to go with. Redmine has the elements of Trac that we liked, with the added ability to track multiple projects. Like Basecamp, Redmine has document storage and messaging systems. It doesn't have milestones per se, but it does allow you to see tasks in calendar and Gantt views, which is very helpful. Unlike Basecamp, you can add custom fields to tickets, users and other features. Having spent many hours in Basecamp, I actually like Redmine much better. It even does time tracking, which we won't use, but is nice to know is there. And the wiki is nice - Basecamp's Writeboards seem much more like an add-on than something integrated. Redmine is a Ruby on Rails application, and it was actually kind of fun to finally get to install and play with RoR a tiny bit. And it's great that it's free and open source. Although that wasn't an absolute requirement for us, it is most definitely a plus, given that so much of our work is implementing open source web tools. And it's nice to save a few bucks per month.

Continue Reading


Why we\'re not friends anymore: the nptech echo chamber

On 07 Jun, 2009 By mpm With 8 Comments

I did a kind of radical experiment a couple of weeks ago: I de-friended almost all of my nptech and client Facebook friends (cutting my friend count by more than 60%). I had a few reasons for this, and over the couple of weeks I've been living this experiment, it's made me quite happy. Of course, everyone is still on Twitter, LinkedIn, etc., so I still feel connected. Even though I tend not to blog anywhere near as much as most of my colleagues about social networks (because it's really not my passion), I've been a fairly early adopter, in the broad sense (of course, if I compare myself to Beth Kanter, I'm a laggard). I have an account on all of the major social networks (and some of the obscure ones, too), listen often, and update fairly regularly. A while ago, I realized that I would keep hearing the same nonprofit technology related stuff over and over again, and that I was contributing to that by using Ping.fm to send the same status notices everywhere, or connecting my Twitter account to my Facebook and LinkedIn accounts, etc. (actually, I think it might even be possible to create an infinite loop doing that stuff). I stopped doing that a while back. It used to be that all of my Facebook "friends" were other nptech early adopters. But around two years ago, a steady stream of my real friends started to come on, and eventually about 40% of my Facebook friends were non-nptech related. I noticed two important things: first, a status notice that a real friend was having a hard time would get buried in the cacophony of new reports, new campaigns, new blog posts, etc. Not a good thing. Also, I noticed that I censored myself on Facebook - I wouldn't say things to friends, or play games, or take silly quizzes, because I felt the need to be "professional." So all of that led me to make Facebook a "work-free" space. I left work-related groups, disconnected this blog from Facebook, etc.
And doing that led me to think a little bit about how we nonprofit technology leaders use these social networks, and how we work with our clients to use these services. I think the majority of nonprofit organizations still aren't all that connected to social networks, and I'm not entirely convinced yet that all of them should be. And I do wonder about the echo effect - if you are an early adopter, and you are on multiple networks, you are going to hear the same stuff over and over. Is that a good thing, or a bad thing? Should we be suggesting that organizations tailor their messages much more specifically, rather than using the services that allow them (and us) to send the same updates everywhere at once? The technology behind social network strategy and implementation is way more my bag than communications strategy, but this experiment has opened my eyes to some of the things we may be doing wrong. And, of course, there is an entirely interesting conversation to be had about the issues of work and personal life, but I'll save that for my other blog.
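
That infinite-loop worry is a real graph problem, by the way: if each service forwards your status updates to other services, a loop in the forwarding graph echoes forever. A toy sketch of checking for one (the service names are just examples, not a real Ping.fm configuration):

```python
# Toy cycle check for a cross-posting setup: each service forwards
# status updates to zero or more other services. Service names are
# illustrative examples only.

def has_cycle(forwards):
    """Detect a cycle in a forwarding graph via depth-first search."""
    visiting, done = set(), set()

    def visit(service):
        if service in visiting:
            return True          # found our way back: a loop
        if service in done:
            return False
        visiting.add(service)
        looped = any(visit(nxt) for nxt in forwards.get(service, []))
        visiting.discard(service)
        done.add(service)
        return looped

    return any(visit(s) for s in forwards)

# Twitter -> Facebook -> Twitter would echo forever:
print(has_cycle({"twitter": ["facebook"], "facebook": ["twitter"]}))  # True
print(has_cycle({"pingfm": ["twitter", "facebook"]}))                 # False
```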

Continue Reading

Avoiding Trainwrecks

On 03 Jun, 2009 By mpm

I spent a big chunk of my day dealing with a project that is, in no uncertain terms, a trainwreck. The client has sunk a ton of money into a product which is, in its current state (first phase supposedly finished), unusable (client and vendor shall remain unnamed). My role in the project has been strategic and as a liaison, not technical, which to some extent gives me a bit of a distanced view. Web development trainwrecks are, sadly, far from isolated cases - they happen all the time, even when all of the parties have good intentions. And as someone who is building a business around doing this sort of work, it is of keen interest to me why some projects end up in the state that this project is in, and I want to make sure to avoid these kinds of situations. So how do we avoid trainwrecks? Some trainwrecks we can see coming miles away, yet we are in complete denial about them. Some trainwrecks are like sudden derailments - it's not at all clear where they came from. But I think all trainwreck projects have the seed of the wreck somewhere in the history of the project. The hallmarks of this particular trainwreck were so clear that, in retrospect, they scream out at me:

  • Lack of transparency about development process
  • Lack of transparency about cost implications of increased scope
  • Waterfall development process (well, the vendor said they practiced Agile, but in practice, it's been waterfall)

As a practitioner of the Agile development process (we use a somewhat modified form of Scrum, in particular), I'm beginning to really see the value of this kind of process. It makes visible all sorts of things that are often hidden. It seems like the Agile methodology helps in a number of ways:

  • Once educated, clients have a window into the development process. They know what small chunks of development are going to happen in a given time interval, and they know what they will get at the end of that time interval
  • Things are developed in priority order
  • Clients can critique things early
  • New functionality becomes a part of the "product backlog," and it is easier to have clarity about what is and is not within scope

Of course, it is theoretically possible to be completely transparent using a traditional waterfall methodology, and completely opaque using Agile, but I do think that the Agile methodology makes it much more difficult to be opaque. It does take some work to educate clients unfamiliar with the methodology (and we've made mistakes along the way as developers). And I've been able to watch this process work well, not only with our own projects, but also with a project I was a strategic lead on. I was pretty skeptical a year or so ago, but now I'm sold. And since transparency has always been something of real importance to me, a development process that encourages transparency is a good thing.
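
For readers unfamiliar with the mechanics, the product-backlog-and-sprint idea above can be sketched in a few lines of Python. This is an illustrative model, not any particular tool's data structure:

```python
# Minimal sketch of scrum's product backlog: items carry a priority and
# an effort estimate, and a sprint pulls the highest-priority work that
# fits the team's capacity. Field names here are illustrative.

from dataclasses import dataclass

@dataclass
class BacklogItem:
    title: str
    priority: int   # lower number = more important
    points: int     # effort estimate

def plan_sprint(backlog, capacity):
    """Fill a sprint in priority order until capacity is exhausted."""
    sprint, remaining = [], capacity
    for item in sorted(backlog, key=lambda i: i.priority):
        if item.points <= remaining:
            sprint.append(item)
            remaining -= item.points
    return sprint

backlog = [
    BacklogItem("Donation form", priority=1, points=5),
    BacklogItem("Event calendar", priority=2, points=8),
    BacklogItem("New report", priority=3, points=3),
]
# With capacity 10, the highest-priority items that fit get pulled in:
print([i.title for i in plan_sprint(backlog, capacity=10)])
```

The transparency comes from the fact that both sides can see the same prioritized list, and anything new goes onto the backlog rather than silently into scope.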

Continue Reading

On 02 Jun, 2009 By mpm

As you can tell, I haven't had much time to blog lately. Here are some great links I've come across that I thought were worth sharing:

Continue Reading

Be like a three year-old

On 03 May, 2009 By mpm With 1 Comments

I don't have kids, but I do know how young kids ask questions. They are innocent, and free of assumptions, and keep asking "why?" In the end, the poor adults either get tired of the questions, or realize that there are assumptions they've been making all this time that might actually be worth questioning. Human processes mold around software. We see this all the time. A CRM gives you these 5 canned reports, and you get used to making do with what's there. A legacy client database requires a certain order of data entry, and your intake forms have been produced to copy that order. Your email software has particular limitations, and you find behavioral workarounds. What's also true, in the realm of customized software, is that software is molded around people. You put in your RFP that a package must spit out data in X, Y, and Z ways because your ED is used to data in that form (maybe because a package they had at their previous organization had those canned reports). You have a requirement that data be entered into the system in one particular way, probably because that's the way you've always done it. Sometimes, you feel the need to replicate a process that the person three administrative assistants ago put in place, molded around their particular limitations, just because that's what you know. When you are undergoing the process of creating or implementing a new system of any sort, whether it be a CMS for a website, a CRM, or some internal system, it is a really good exercise to be like a three year-old, and keep asking "why?" Why do we need this feature? Why will this report be important? Why should the software work this way? Once you peel the layers down to the bottom, you'll either have "we don't know" or "because we believe it will help us meet our mission in this specific way." Then you know what you should take, and what you can leave behind.
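
If you want to be systematic about it, the three year-old's technique is basically the classic "five whys." A toy sketch in Python, with an invented requirement chain:

```python
# Toy "five whys" drill-down: follow each stated reason to the reason
# behind it until you hit bedrock. The example requirement chain is
# invented for illustration.

def root_reason(requirement, reasons, max_depth=5):
    """Follow 'why?' answers until there is no deeper reason."""
    current = requirement
    for _ in range(max_depth):
        if current not in reasons:
            break
        current = reasons[current]
    return current

reasons = {
    "Data must be entered surname-first":
        "The intake forms list surname first",
    "The intake forms list surname first":
        "The old database required that order",
    "The old database required that order": "We don't know",
}

print(root_reason("Data must be entered surname-first", reasons))
# -> "We don't know": a sign the requirement can probably be dropped
```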

Continue Reading

CiviCRM Developer Camp

On 03 May, 2009 By mpm

I got to spend one day at CiviCRM developer camp this week. Unfortunately, it came after 4 long days of conferencing, after many exhausting days of work, so I wasn't at my peak. But I learned a lot, and thought I'd share some of what I took away from that day. First, the core team shared some of the new stuff coming out in version 2.3, and it is awesome. One of the major reasons CiviCRM gets dinged as a CRM/DMS is that it doesn't have reports. Well, that problem is about to go away with the release of CiviReport in 2.3. There will be a number of canned reports, and some really nice ways to create reports. Plus charts! Yay! There were some pie charts, and regular bar charts. I don't have the new svn trunk of CiviCRM installed, otherwise I'd show some screenshots, but it looked really nice. (I'll be installing CiviCRM from svn in the next week, and I'll probably blog more as 2.3 develops.) There are some really nice usability improvements coming in 2.3 as well - to make the basic contact pages much easier to navigate. And there is a new menu system, which will make things a lot easier. And, for Drupal users, some sweet Views 2 and CCK integration. CiviEvent is getting waiting lists, registration approval, user-modifiable registrations, and some other improvements. The alpha of 2.3 should be out by July. I also learned about CiviCase, which is actually present in 2.2. I saw the example of it used for the Physician Health Program in Canada. It's quite good, and there are some useful docs on the CiviCRM wiki showing it at work. I'd love to find an organization, such as a small human services organization, in need of case management software, that could use CiviCase - it would be a great, and relatively inexpensive, alternative to current offerings out there. And more organizations using CiviCRM for case management would help CiviCase get even better. I also dug into some of the internals and code of CiviCRM, and feel better equipped to start contributing more than ideas and feedback to the project.

Continue Reading

Where I'll be at NTC

On 22 Apr, 2009 By mpm With 2 Comments

NTC is coming, and I don't have to pack! That's a good thing. But I will be BARTing my way into SF every day: Saturday for Penguin Day, Sunday through Tuesday for NTC, and Wednesday and Thursday for CiviCRM Developer Camp. I'm very much looking forward to all of it, even though it seems like it's going to be an exhausting 6 days. I'd love to meet new folks and see as many old friends as possible, so I figured I'd share where I'll be during these days, and perhaps we can meet up. You can email me, reach @pearbear on Twitter, or give me a text message or call ... if you know my cell, that is. :-)

Continue Reading


Why you should care that Oracle is buying Sun

On 20 Apr, 2009 By mpm With 2 Comments

In general, the activities of the big tech corporations have a somewhat limited and indirect effect on nonprofit technology. For large enterprises, the activities of the big players are a much more immediate and important set of issues to deal with. For us, it's generally much more removed. However, today's news that Oracle is going to buy Sun Microsystems has some very important implications. Why? It has to do with the fact that many, many nonprofit websites and web applications are built using MySQL, the most popular open source database management system. Sun bought MySQL AB (the company behind MySQL) last year for $1 billion, and therefore MySQL AB now becomes a part of Oracle, its primary competitor. There is some suspicion that there may be anti-trust challenges because of this, but if it goes through, it raises some huge questions about what happens to MySQL. Of course, since MySQL is open source, there is no danger of it going away entirely; someone can always fork it. And, ultimately, there is a great open source database alternative called PostgreSQL, but support for it is not universal. However, the future of ongoing support and development for MySQL is certainly in question. Most nonprofits don't get any support from MySQL AB directly, but larger organizations that might have been getting some support might see changes down the road. It's something that those of us who depend on MySQL for our web development projects will be watching quite closely.

Continue Reading

Blog shout outs

On 15 Apr, 2009 By mpm With 3 Comments

On the right is my blogroll, which needs updating, but I thought I'd do some shout outs to blogs I've lately been loving and really learning a lot from, and which are probably not on that list (yet).

  • Wireframes Magazine - I've been doing Information Architecture for a very long time now, but it's great to learn new tricks and tools.
  • Flowing Data - OK, I'll fess up, I'm a data geek. And I love data visualizations, and ways to make data easily accessible. I am so envious of people with graphics skills who can do that well. There are a whole lot of really cool things here.
  • RoughType by Nicholas Carr - really smart dude, really interesting stuff.
  • ONE/Blog - ONE/Northwest never ceases to amaze me
  • The Open Road - Matt Asay has some interesting insights from the Open Source biz world

Continue Reading


CRM & CMS integration: Web pages and forms

On 15 Apr, 2009 By mpm With 2 Comments

Third to last in my series on CMS and CRM integration (next up, Joomla and Salesforce, followed by Drupal and Salesforce) is using web forms. I wanted to talk about this because it is arguably the most common form of "integration" between CRM and CMS out there (besides the manual kind). You've got a CMS, and you've got a CRM somewhere else, and you need some way for data from users to make it to your CRM. Of course, it's not really integration - there is no sharing of data between the CMS and the CRM in any useful way. But webforms can really help you get things done. Here are some examples of things I've done and seen done:

  • A custom donation page that sits on a service like Network for Good and is linked from the website, or framed within it
  • The HTML for a "Web to Lead" form from Salesforce.com, pasted into a CMS page
  • The HTML for an event registration form or donation form that goes to a hosted service

In the first option, the form isn't hosted on your site at all. With this option you have the least control over look and feel - the vendor controls the look and the behavior. An example of this I've run into is when an organization uses Blackbaud's Raiser's Edge, and wants to have online donations via NetSolutions, their older (and much cheaper) "integration" tool. They provide a page which hooks directly into the user's RE installation. But you can't customize the page in any useful way, so if you've just designed a brand-spanking-new site, this page is gonna look like crap. (Luckily, Network for Good's donation pages at least look snappy and nice, but they are going to look a lot different from your website.) The other options are much better for look and feel - you can take the HTML, and, in most instances, style it to look like your site. You can even sometimes include Javascript for validation or other functionality. But this is still strictly one-way communication - the form data goes directly to the service (and does not pass go). You don't get any of it. This is a great start to integration if your budget doesn't allow for true, deep, two-way integration between CRM and CMS. And it's a great way to get your feet wet in thinking about what you might want to do with CRM and CMS. And, in some instances, depending on both CRM and CMS, it might be your only option.
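
To make the second option concrete: a Salesforce.com "Web to Lead" form is just HTML that POSTs a handful of fields to a Salesforce servlet. Here's a Python sketch that assembles that payload, to show what crosses the wire - the org id is a placeholder, the field names follow the forms Salesforce generates, and nothing is actually sent:

```python
# Sketch of the data a Salesforce "Web to Lead" form submits. A real
# form is generated HTML that POSTs these fields to Salesforce; here we
# just build the payload. The org id below is a placeholder.

from urllib.parse import urlencode

WEB_TO_LEAD_URL = "https://www.salesforce.com/servlet/servlet.WebToLead"

def build_lead_payload(org_id, first_name, last_name, email, ret_url):
    """Assemble the standard Web-to-Lead fields (one-way: no data comes back)."""
    return urlencode({
        "oid": org_id,          # identifies which Salesforce org gets the lead
        "first_name": first_name,
        "last_name": last_name,
        "email": email,
        "retURL": ret_url,      # page the visitor lands on after submitting
    })

payload = build_lead_payload(
    "00D000000000XXX", "Ada", "Lovelace",
    "ada@example.org", "https://example.org/thanks",
)
print(payload)
```

Note that this is exactly the one-way communication described above: the browser hands the data straight to the vendor, and your CMS never sees it.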

Continue Reading

Penguin day comes again

On 09 Apr, 2009 By mpm With 1 Comments

I love Penguin Day. One of my favorite days of the year. It always comes right around NTC. This year, it's before NTC, on Saturday, April 25. It's a day dedicated to conversation and community around nonprofits and open source software. There's some great stuff on the agenda, like:

  • Introduction to Free and Open Source Software
  • Fundraising with all free software
  • Free And Open Source Online Advocacy: Tools And Best Practices
  • Making sense of Free and Open Source Content Management Systems
  • Introduction to Blogging with Wordpress
  • Intro and Advanced sessions on Joomla! and Drupal
  • CiviCRM vs Salesforce.com: What Are the Differences?
  • Mobile Volunteering: The ExtraOrdinaries Project
  • Creative Commons And Open Content
  • And many more...

You can register at Penguinday.org. Thanks to the generosity of Google, we're delighted to grant fee waivers to anyone who needs one! I look forward to seeing folks there.

Continue Reading


Drupal security, and other CMS Report comments

On 03 Apr, 2009 By mpm With 8 Comments

Now that the Idealware CMS report is out, I get to have my say about it. Here's the first post; there might be more to come. The thing prompting this post is the little storm about the security metric that we used to try to get a handle on the security of the 4 different systems we reviewed. More on that in a bit. You might think that comparing four different open source packages that, in essence, do pretty much the same thing (in a broad sense) would be a cakewalk. In fact, nothing could be farther from the truth. The developers of each project have completely different sets of assumptions about the right way to do things, and completely different philosophies and ethos when it comes to building interfaces and functionality. Making apples-to-apples comparisons of these systems was one of the most difficult analytical tasks I've taken on in a while (and, actually, much of the heavy lifting of designing the analysis was done by Laura Quinn), and until you attempt such a thing, please be somewhat tempered in your complaints about it. Now the security issue. One of the 12 different aspects we compared is "Scalability and security". The report isn't about security; it's a very, very broad comparison of the systems, with security as a very small component. That's just the context. Two (yes, just two) questions out of many relate to security: first, a simple metric relating to security reports, and second, what processes are in place in the communities to deal with security. This wasn't designed to be an in-depth, complex analysis of security. If it had been, we would have done a lot more work on how to measure security. On the Four Kitchens blog, they say, "While both reports above seem to identify Drupal (and Joomla! and WordPress, to be fair) as having notably bad [emphasis mine] security, they're also both based on one superficial metric: self-reported vulnerabilities." Now I can't speak about the IBM report (I haven't even read it yet), but our report says no such thing. Drupal gets a "Solid" on Scalability and security. Solid, which is only one step below Excellent. And you know why it got a "Solid"? Because, indeed, it does have more reported security vulnerabilities than Plone (as do Joomla and WordPress). David Geilhufe, who also takes issue with the security metric, has some good points. Yes, sheer numbers of vulnerabilities are nowhere near the best metric of whether or not a system is secure. But as a quick comparative look at a small number of open source systems, it's hard to argue that it contributes no information. Four Kitchens seems to suggest that part of the reason for more vulnerabilities in Drupal compared to Plone is that it's more popular. But, if you've been an observer of the Linux/Windows FUD wars, you'll remember that Microsoft makes that exact same argument about why there are more security vulnerabilities in Windows as compared to Linux. And the Linux folks say, in response, "It's not popularity, it's design." I'm sure that Four Kitchens, and most open source software developers, agree with that perspective. In reviewing Plone, and talking with people who develop for Plone, I was convinced that the reason Plone had fewer reported vulnerabilities was not just that it was less popular - it's that it (and Python and Zope) was more secure by design. I am completely happy with Drupal's security (otherwise, it wouldn't have gotten a "Solid"). I think the Drupal community takes security extremely seriously, and if they didn't, I wouldn't have chosen it as a platform for development. I also think that the Joomla and WordPress communities take security seriously. In our estimation, they were all really good. But Plone was just that much better.

Continue Reading


New kid on the block: BlackbaudNow

On 02 Apr, 2009 By mpm

Blackbaud announced, just in time for AFP, their new product, called BlackbaudNow, in partnership with PayPal. It is a curious service: an extremely low-end, low-cost website/online donation package from a vendor that spends most of its time on the very high end of the scale. It is simple. An organization can sign up for a free account and get a 5-page website, including a donation page, about page, etc. Editing a page is basically point and click - it highlights the part of the page you can edit, and you edit it with a WYSIWYG editor. It's decently AJAXy, but no, it's not shiny - at least not by my definition of shiny. You have a small number of templates to choose from (which, frankly, aren't so great looking - I think they dedicated more graphic design time to their branding and pages than they did to the templates). It's free, although Blackbaud takes a percentage off the top. People can donate to your organization via PayPal only, and you can track donations in their very simple interface. You can export your donation history into a CSV file, and you can make your reports into PDFs. There are no APIs. This was developed by the team that Blackbaud acquired when they acquired eTapestry. And it's designed to make migration to eTapestry easy - therein, I suspect, is the key. I'm betting this is a loss leader - a product designed to get people in the door, and when they are chomping at the bit for more (which they will be about 2 days after they set up their site), there is a more costly (and profitable) product waiting right around the bend for them. Small nonprofits - especially those with few or no staff - are always in a particularly challenging place when it comes to finding the best solution for a web presence and online donations. But I don't think that a tool like this is going to serve very many nonprofits for very long, given its limitations. Of course, people like me, who make our living building websites and helping facilitate the web presences of organizations, look askance at tools like this, so take what I say with a grain of salt. But I have to admit that this seems to me a bit too much like a gateway drug - get them hooked on free, then move them slowly but surely to much more expensive systems. And in the end, won't a modest investment (say, $2K or so) on the part of an organization in getting a better web presence serve them better in the long run? Heck, I think a Wordpress.com site attached to a Network for Good donation page would serve them better. At least they'd have a lot more well-designed templates to choose from, and a real CMS engine. Honestly, I'm underwhelmed by this service, and, in addition, I have a bone to pick with Blackbaud. The online help for BlackbaudNow is powered by the open source software MediaWiki. It is well hidden, but a somewhat savvy MediaWiki user will notice the telltale signs (the URLs are one giveaway). Of course, proprietary software makers use open source software all the time; that's not the problem. The problem I have is that they hid it. Why hide the fact that they are using an open source tool to build their online documentation? Not even a small mention on the About page. Did they do any modification to the code to make it work like they wanted? Did they contribute anything back to the MediaWiki community? At the very least, they could have given credit where credit is due.

Continue Reading

Exciting changes afoot...

On 01 Apr, 2009 By mpm With 1 Comments

I have some exciting news. For the last few months, I have been working on a new collaboration called OpenIssue, which is a growing, diverse, self-reflective and constantly-learning team. We are focused on delivering quality web technology solutions to nonprofit organizations and social enterprises. As you know, I have built long-time expertise in open source software and web applications, particularly Content Management Systems (CMS) and online database systems, including CRM. Thomas Groden, my new business partner, has expertise in Software-as-a-Service Constituent Relationship Management (CRM) systems, as well as much broader expertise in technology infrastructure. All technology implementors have to choose their tools (unless they run a very large shop), and we have decided to focus on implementation of both Salesforce.com and CiviCRM as CRMs, and Drupal as a CMS. We are keenly interested in building on our expertise to integrate these open platforms in really rich ways, to allow organizations to create great online applications. I'm excited to be a part of a team - I've been a soloist for a while, and it's nice to build collaborations, and to work together with people with shared ideals on larger projects than I'd be able to take on alone. And I'm really excited by the set of technologies we're working on, and the kinds of applications we'll be building with them. And you can follow us on twitter.

Continue Reading


CRM & CMS Integration: Plone and Salesforce.com

On 16 Mar, 2009 By mpm With 4 Comments

Today, I was reading up on what the Plone community has done with integrating their CMS with Salesforce.com. I am thinking that this might be a good model for how we can do it with Drupal, but that's a subject for another post. (There is an architecture diagram from the Plone/SF Integration group.) There's a good overview of the integration on the developerforce wiki. There are 5 components to the integration:

  • a couple of toolkits that provide the basic back-and-forth between Plone and Salesforce.com (they talk to Python and Zope)
  • an auth plug-in that allows Salesforce.com objects to be Plone users, handles credential checking, caches user data, and syncs data between Salesforce.com and Plone
  • an integration of PloneFormGen with Salesforce.com for web-to-lead forms, etc.
  • an event management product that connects with Salesforce.com
  • A PayPal integration product

This is a pretty robust set of channels for data to move back and forth between Salesforce.com and Plone. There is a Plone/Salesforce.com Integration group that keeps working on this, and a number of organizations, including ONE/Northwest, have invested huge amounts of time and resources in this integration. This is, for sure, one of the most robust open source CMS-to-CRM integrations out there, and one that seems to be getting pretty close to providing very powerful integration "out of the box" - instead of having to piece things together and write custom code, which is more common than not. I haven't gotten my hands on this to try (not being a Plone person, I doubt I will), but folks might want to talk in comments about how straightforward the integration is, given differences in data for different instances of Salesforce.com. I don't know how much code tweaking is required to really get this going. But in any event, it's great that it exists, and it's a great benchmark for CMS/CRM integration.
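
The auth plug-in piece is the interesting one: it keeps CMS users and Salesforce.com contacts in sync. Schematically, a one-way version of that sync looks something like this - an illustrative sketch with made-up field names and an in-memory "CMS," not Plone's actual implementation:

```python
# Schematic one-way sync: upsert CMS user accounts from CRM contact
# records, keyed on email address. This illustrates the pattern only;
# Plone's real integration works through its auth plug-in machinery.

def sync_contacts_to_users(contacts, users):
    """Create or update a user record for every CRM contact."""
    created, updated = 0, 0
    for contact in contacts:
        key = contact["email"].lower()
        if key in users:
            users[key]["name"] = contact["name"]
            updated += 1
        else:
            users[key] = {"name": contact["name"], "source": "crm"}
            created += 1
    return created, updated

users = {"ada@example.org": {"name": "A. Lovelace", "source": "manual"}}
contacts = [
    {"email": "ada@example.org", "name": "Ada Lovelace"},
    {"email": "grace@example.org", "name": "Grace Hopper"},
]
print(sync_contacts_to_users(contacts, users))  # one created, one updated
```

The hard parts in real life - which the Plone components apparently handle - are credential checking, caching, and pushing changes back the other way.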

Continue Reading

Salesforce and CiviCRM

On 11 Mar, 2009 By mpm With 7 Comments

This morning, I looked at both Salesforce.com, with the second nonprofit template, and CiviCRM, with a small group of colleagues. All of us implement, or have used, one or both of the systems, but each of us has expertise in only one of them (I'm one of the CiviCRM folks). It's pretty interesting to compare them. The nonprofit template has certainly helped make it easier for nonprofits to do the brain surgery required to use a for-profit sales tool for nonprofit CRM purposes. Salesforce.com is, of course, much more sleek and polished, and the power behind the application is pretty unassailable. And there is a huge ecosystem of add-ons available for Salesforce.com that doesn't exist yet for CiviCRM. But there are significant modifications, both in the way nonprofits think about data, as well as in the way data is manipulated, that have to take place in order for organizations to use Salesforce.com. CiviCRM is really intuitive for organizations to use out of the box. Donation pages and event registration are built in to CiviCRM, but have to be added into Salesforce.com. It's way easier to create relationships in CiviCRM - you can create any kinds of relationships you want. You can create groups and smart groups easily in CiviCRM; this is harder in Salesforce.com, and smart groups don't exist there at all. Anyway, there's lots more, and you'll be hearing lots more about both of these tools from me in the coming months.
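
For those who haven't used them: a smart group is essentially a saved search - membership is computed from a rule every time you look at it, rather than stored as a fixed list. A toy illustration of the difference (the contact fields are made up):

```python
# Toy contrast between a static group (fixed member list) and a smart
# group (membership recomputed from a rule each time it's read).
# Contact fields here are invented for illustration.

contacts = [
    {"name": "Ada",   "total_donated": 250, "city": "Boston"},
    {"name": "Grace", "total_donated": 50,  "city": "Oakland"},
    {"name": "Alan",  "total_donated": 500, "city": "Boston"},
]

# Static group: a snapshot. New qualifying contacts are NOT picked up.
major_donors_static = [c["name"] for c in contacts if c["total_donated"] >= 200]

# Smart group: a stored rule, evaluated on demand.
def smart_group(rule):
    return lambda: [c["name"] for c in contacts if rule(c)]

major_donors = smart_group(lambda c: c["total_donated"] >= 200)

contacts.append({"name": "Joan", "total_donated": 300, "city": "Boston"})
print(major_donors_static)  # ['Ada', 'Alan'] - stale after the new donor
print(major_donors())       # ['Ada', 'Alan', 'Joan'] - picks up Joan
```

That dynamic behavior is why smart groups matter so much for things like mailing lists of current major donors.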

Continue Reading

DrupalconDC Final Report

On 10 Mar, 2009 By mpm

It's been a few days since I got back from Drupalcon, and I've had time to let everything that happened settle in. It was a great time, and I'm really happy I went. We had a fabulous (and quite large) nptech/progressive exchange/community organizing BoF. There was a show-and-tell session for nonprofit websites (which I didn't make it to). I went to some interesting sessions on Ubercart and Organic Groups, and a BoF on Drupal in churches (where I wondered about the theological spectrum, and guessed it was populated mostly by evangelicals.) I met lots of great people, and saw old and new friends. I think, also, I've completely drunk the Drupal koolaid. I'm psyched to be working with Drupal more intensely (I've got 4 Drupal projects going at the present moment.) There are lots of new things to learn, and challenges to face, but I'm excited about digging in a lot deeper. I'm sure I'll have more to say as time goes on. And I'm looking for good excuses to go to Paris for Drupalcon Paris! There were lots of great talks, and the videos are up!


DrupalconDC Report #1

On 04 Mar, 2009 By mpm With 1 Comments

At the end of day one, I figured I'd give a little report on how DrupalconDC is going for me. I'm having a good time, and learning a lot. I went to three pretty intro talks (two of them were a bit too intro for me, but I got a few good tips) and one advanced panel.

  • Themers Toolkit - I've only done a few themes, and modified a few, but I guess that was enough for this panel to be too basic for me. Still, I did learn a few tricks I didn't know, so it was useful. It was a good talk.
  • Totally Rocking Your Development Environment - also covered mostly stuff I knew, but I did learn a few tips (and also learned a little from my next-door neighbor). It was a great talk by an incredibly enthusiastic speaker. I can't believe, though, that she suggested using Makefiles for Drupal!
  • Organic Groups - since I haven't personally implemented OG, I didn't know a lot about the innards, and how it really works. It was a great introduction, and I'm totally sold on it. He gave some sweet examples of its use (like teamsugar.com), which is amazing, and made me totally rethink using Elgg.
  • Advanced Theming Techniques - a nice talk given by two folks from CivicActions, one of whom I'd worked with jointly on a client project. I learned a fair bit, and now have some good techniques to think about using as I start doing more serious theming (although, truthfully, I'd like to eventually be able to hand that off to folks who have a better visual sense than I do.)

Dries' keynote was fun, and it was great to hear a bit about the history, and also the ideas about where Drupal is going. One thing he said in particular stood out: "Start thinking of the internet as one big machine." The idea is that as barriers to the movement of data come down, things that were never possible before become possible. And there is some really cool stuff coming in the future, like OAuth, job queues, RDFa output, XMPP, and activity logs. Really neat. I missed a couple of talks I wish I could have made, like the Drupal SEO talk and Install Profiles. I was glad to see that there are a significant number of women here, and a number of women presenters, too. I hung out with Drupalchix for lunch, met new folks, saw some colleagues, and generally have been having a good time. I'm very much looking forward to tomorrow.


What I\'ll be writing about (and not writing about)

On 04 Mar, 2009 By mpm With 1 Comments

I get a lot of email from people hawking various wares, fundraising ideas, new ways to use Web 2.0, or this and that. I'm sure this post won't make much of a dent, since I suspect that at least 70% of the people who send me stuff (I get 6-10 emails a week that fall into this category) have never read this blog (even though they might say they love it.) In getting this stuff, and trying to figure out what to do with it, I realized that I needed to better understand what I was doing myself, and to articulate it clearly. Once I get this blog post done, I can clean out that "To Blog" inbox. In the realm of things this blog is meant to cover, I have two passions: data (and moving data around), and open source software. Of course, I talk about both of these things in the context of nonprofit technology and the sector in general, as well as in my role in the sector as a provider of ways and strategies to use the latter to handle the former. It really is these things I want to focus on most. I'll always be talking about CMS and CRM, and increasingly the integration between them. I'll always be talking about open source software, particularly as it relates to web applications, but more generally as well. I'll probably be talking a lot about Drupal. I'll also be talking a fair bit about SaaS CRMs in the coming months, for reasons that will become apparent relatively soon. I'll always talk about what it's like to do the work that I do, and more about how I do it. I'll still throw in the occasional post about Web 2.0, particularly as it relates to moving data around. And you'll always get a bit of shiny from me. And you will get the occasional promo post about something the organizations on whose boards I serve (Aspiration and NTEN) are doing. I won't blog about fundraising or communications strategies, or campaigns, or Skittles.
I won't talk much about communications, except as it relates to data or open source. I won't explain how or why to use Twitter or Facebook, unless you are trying to put a Twitter stream into a Drupal page (moving data, open source.) I will, though, talk about what kind of data you should keep and move, and why. I won't post information about what nonprofit has adopted what shiny software product, unless it's my client and it illustrates a specific point, or it's a case study (and I generally don't use my clients' names.) So, if you are part of the 30% that reads this blog and sends me stuff to post, you now have an idea of whether or not I'll use it.


New leap for open source CMS vendor

On 26 Feb, 2009 By mpm With 1 Comments

Mpower Open, the vendor that took its high-end CRM/DMS product, MPX, open source last year, has adopted a new name: Orange Leap. They have also released two new products, called Orange Leap and The Guru. The combination of Orange Leap and The Guru is a web-based CRM/DMS and reporting system aimed squarely at Salesforce.com and Convio Common Ground. The pricing of the hosted version is definitely competitive. Orange Leap is possibly going for what is now often called the "open core" business model, although it's not entirely clear. Their new products (as well as MPX) have "community editions" - mostly these lack services and support, which makes sense. But Orange Leap Community Edition also lacks "Domain specific fields and rules" and "Outbound Enhancements, Business Rules, and Processes". It's not actually clear what those are. There is mention of a "community portal", but it's not evident anywhere I can find. You need to request a demo of their open source tool, instead of following the standard practice of providing an open, public demo for everyone to see and play with. On their brand-new, quite lovely (and orange) website, there is no place for a developer to find the software, interact with others, or find a way in. There is no community that is at all visible. I like very much what these folks are trying so hard to do - provide high-quality, high-end open source applications for the CRM/DMS space. But I'm afraid they are going to be squeezed to a pulp between the behemoths of Salesforce.com and Convio on one hand, and the strong, vibrant open source community of CiviCRM on the other. Their only way out is to build an equally strong, vibrant community of developers and implementors - and that will be an uphill battle.


CRM & CMS Integration: Blackbaud Raiser's Edge and NetCommunity

On 13 Feb, 2009 By mpm With 27 Comments

What? She's talking about Blackbaud? Yes, it might be surprising, but I got a friendly email from fellow NTEN board member Steve McLaughlin, who also happens to be head of all things internet (more formally, Director, Internet Solutions) at Blackbaud. He gave me a demo and overview of their NetCommunity tool, which has been around for a while, and I figured it deserved a blog entry. It is, in fact, a great example of integration of a CMS and a CRM. Originally, I wasn't going to cover single-vendor solutions like this one, because I believed (and, honestly, I still do) that you're not going to get as powerful a CMS as the best-of-breed CMS tools. However, it is true that Raiser's Edge, the CRM/DMS tool that this integrates with, is inarguably one of the most important tools out there. Some call it the gold standard. For many other CRM/DMS vendors, it's the red spot at the center of the dartboard in their office. The demo was pretty cool. But you know me, I fall for shiny, especially when it comes to data. The integration between the web front end and the RE back end is bi-directional and sweet. There were a lot of things you could do, including accepting donations, tracking personal donation pages, and the like, and a lot of different ways to track what your donors and constituents did, both online and off, and have those show up in really interesting ways. It is, in many ways, the kind of CRM/CMS integration that lots of organizations want and need. Organizations can get this package in three different ways: on-premises (installed inside the firewall), hosted, or SaaS.
Their SaaS offering is called "NC Grow", which provides sets of fairly simple CMS templates to start with, designed for organizations that, in their words, "are ready to reap the benefits of richer online marketing and communications, but may not have the resources or expertise in place to make such a website come to life." The big kicker, pretty much as always with Blackbaud, is the price tag. There is a $10K license fee that you have to pay if you use the on-premises or hosted versions. Expect a $35-45K price tag for development and integration. The SaaS offering, NC Grow, has a $20K/year price tag. All of this is, of course, above and beyond the megabucks you're already paying for Raiser's Edge. I didn't get a very close look at the CMS (I'm wishing in retrospect that I had), but the little bit I did see of it suggested to me that it was somewhat more limited than CMS systems such as Drupal or Plone. Even if, perchance, it's not, you still don't get the vibrant community of developers making cool modules and add-ons to do just about anything you can imagine - you'll have to either wait for Blackbaud to do it, or, perhaps (I'm not even sure if this is possible, but correct me if I'm wrong in comments) have someone custom-develop special features for you. And you'll have an automatic $10K price tag tacked on that you won't pay with the open source tools. I have a hard time believing that that translates to $10K worth of feature value (one could argue it's $10K worth of integration value, though I'm not sure about that.) Bottom line: if you are an organization that has Raiser's Edge, is committed to keeping it, and wants to do sophisticated integration between it and a web front end, then NetCommunity is probably your best solution. But before you jump in, make sure that the CMS is going to have the sophistication and power you need.
And know that because RE doesn't have open APIs, you are unlikely to be able to create the kind of sophisticated integrations with a different CMS that NetCommunity provides with RE. But if you are not an RE user, or are considering migrating off of RE, I don't think the combination of RE and NC is especially cost-effective. You can get this level of integration with Drupal/CiviCRM for sure, and likely Plone/Salesforce and Drupal/Salesforce (with a bit more work.) More on those later.


CiviCRM and Drupal (& Joomla)

On 26 Jan, 2009 By mpm With 3 Comments

CiviCRM was the first nonprofit-focused open source CRM (one of only two, at this moment.) It is a great tool for small to medium-sized organizations that are looking for a CRM to track members and donations, register people for events, and do mass mailings. There are also some other features, like grants management and case management, that are more nascent, but promising for the future. I've implemented CiviCRM together with Drupal, and I'm really psyched to keep working with this great combo. CiviCRM originally integrated only with Drupal, but recently a lot of work has been done to also integrate CiviCRM with Joomla. CiviCRM acts in Drupal like a module, and in Joomla like a component. Since the code sits in the same place, and the databases can even be shared (or not - it's your choice), in effect CiviCRM becomes part of your CMS. You install CiviCRM inside your CMS installation directory, and the CMS and CiviCRM talk to each other through PHP APIs (or "hooks"); there are also some examples of database calls across the CMS/CRM boundary. There isn't much work to be done by you, or by the person who implements it for you, unless you want to do customizations and expose CRM data in new and interesting ways. Users in your CMS installation will become users in CiviCRM when you install it, and will be synced going forward. You can set up web forms (for donations, event registration, etc.) and have them be menu items. It's a very straightforward integration. Using CiviCRM and Drupal is a great way to easily get powerful integration between your CMS and your CRM. They are both installable on pretty standard shared-hosting accounts (although shell access really helps.) It's a really cost-effective way to get powerful features. The disadvantage is that you have to choose CiviCRM and Drupal (note on Joomla below). Both have their disadvantages, and you might have a variety of reasons for not choosing one of them.
Jon Stahl, in his comments on my first post in this series, said: "a PHP API accessible only to other PHP apps on the same machine is simply not sufficient integration in an age of web services, where people run different apps on different machines and use languages other than PHP for building web apps." I agree with Jon to some extent. I know that CiviCRM has been working on its web services APIs, and a really strong set of them would mean that people could integrate CiviCRM with more CMSes, which, from my perspective, would be a Really Good Thing. That said, I do think this combo would be a cost-effective solution for good chunks of the nonprofit community. Note: my understanding is that there are still some snags with the CiviCRM/Joomla integration, and I'm not very familiar with it. If you already have a Joomla site (or you are about to choose Joomla) and you want to use CiviCRM, you should talk to the CiviCRM-Joomla folks, or check out the CiviCRM forums. One example: since Joomla doesn't have granular ACLs (Access Control Lists), there are likely issues with how permissions work in terms of access to specific parts of CiviCRM. If you have detailed info, please feel free to share in the comments.
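
The user-to-contact sync described above can be sketched in miniature. This is not CiviCRM's actual code (CiviCRM is PHP, and its real API differs); it's a hypothetical Python model of the behavior: CMS users are matched to CRM contacts by email, created when missing, and updated on later syncs rather than duplicated.

```python
# Hypothetical sketch of CMS-user -> CRM-contact syncing, keyed by email.
# Not CiviCRM's real API - just a model of the "users become contacts,
# and stay synced" behavior described above.

def sync_users_to_crm(cms_users, crm_contacts):
    """cms_users: list of {'name': ..., 'email': ...} dicts.
    crm_contacts: dict mapping lowercased email -> contact dict.
    Returns (created, updated) counts."""
    created = updated = 0
    for user in cms_users:
        key = user["email"].strip().lower()
        if key in crm_contacts:
            crm_contacts[key]["name"] = user["name"]  # refresh existing contact
            updated += 1
        else:
            crm_contacts[key] = {"name": user["name"], "email": key}
            created += 1
    return created, updated

crm = {}
users = [{"name": "Ada", "email": "ada@example.org"},
         {"name": "Grace", "email": "grace@example.org"}]
print(sync_users_to_crm(users, crm))   # first sync creates both contacts
users[0]["name"] = "Ada L."
print(sync_users_to_crm(users, crm))   # second sync updates, no duplicates
```

The point of keying on email is that repeat syncs are idempotent: a user edited in the CMS updates their existing contact instead of creating a second one.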


Integration of CMS and CRM - Preamble

On 26 Jan, 2009 By mpm With 3 Comments

As I talked about in my last post, there are a variety of strategies one can use to move data between your CMS and your CRM. I'm going to choose a few examples to look at in some depth. Some of these are examples I've been working on with clients, or have played with; some are just examples I know about, but they are prominent, useful ones to talk about. I'll talk a bit about mechanics, about strengths and weaknesses, and about the situations in which you might want to look at each. I'll cover:

  • CiviCRM/Drupal (with Joomla notes)
  • Plone/Salesforce.com
  • Using varied webforms (like DemocracyInAction, Blackbaud, Network for Good, etc.)
  • Drupal/Salesforce.com
  • Joomla/Salesforce.com

You'll notice that only Drupal, Joomla and Plone are represented among the CMSes. That's mostly because that's what I know, and there is enough critical mass for all three of them that some integration work has been done in a systematic way (the exception is Drupal/Salesforce - it's only halfway systematic.) I haven't included any all-in-one systems (like Kintera), mostly because I don't think they are a good idea - you might get a halfway decent CRM, but you'll for sure get a crappy CMS, and there is no good reason for that. Another note: I'll talk about this in detail later, but Salesforce here also includes the new Salesforce.com app, Common Ground, by Convio. From what I can tell (I'm learning a lot more fairly quickly), integrating Common Ground with a CMS should be pretty much the same process as integrating Salesforce.com. First up: CiviCRM/Drupal. I'm choosing this first because it is a pretty interesting example, and also an example of what I would call easy and tight integration.


Looking forward to NTC 2009

On 14 Jan, 2009 By mpm With 3 Comments

I love NTC (NTEN's Nonprofit Technology Conference). I would be dishonest if I said I didn't have a sweet reminiscence for the Circuit Rider Roundups of old. But they are long gone. As fields often do, ours grew up and professionalized. And what has taken its place is valuable to a much wider audience (and a much larger one!) And, this year, for the very first time, I live in the same city in which NTC is taking place. Hurrah! So, a few things to say about what I'm looking forward to from April 25th to April 30th:

  • April 25: Penguin Day SF! It's happening the day before NTC this year, not the day after. Gather with folks and spend an exciting day peer-sharing about free and open source software in nonprofit organizations. Any level of background in the topic is welcome, and everyone learns.
  • April 26-28: NTC. Another jam-packed year full of great panels and expertise sharing. I'll be involved in two panels this year. (And lots of conversations on the side.)
  • April 29-30: Hopefully, there will be a CiviCRM developer camp. Yay! I've been using CiviCRM for a year or so, and have begun to get involved in implementation. Looking forward to digging in deeper.

And email me if you want to have coffee, or lunch, or a side conversation in the Science Fair. And, you can help folks get to NTC!


Integration of CRM and CMS

On 14 Jan, 2009 By mpm With 17 Comments

If there are two acronyms at the center of nonprofit communications, it's these two: CRM (Constituent Relationship Management - and I'm making this broad enough to include fundraising) and CMS (Content Management System). Because of this, integration between the two is a real need waiting to be filled. What's involved? First, the what - what to integrate? Most often, nonprofits want to capture information from web users. That information could be a newsletter sign-up, a contact form that should be responded to, an online donation, or an event registration. The organization wants to capture the demographic details, as well as make sure that data is synchronized with the data they might already have on that web user, so they can track their constituents over time. In addition, sometimes nonprofits want to expose data from the CRM to the CMS. One purpose is to allow users to modify their own information (if the site allows logins.) Another purpose is membership lists, or group lists, or perhaps live tracking of donations from a specific campaign. There are four strategies that can be used for this integration:

  • Manual. The old-fashioned form of integration. Form submissions go to email, or a separate database, and someone manually enters that data into the CRM. Data from the CRM is manually reported and put up on the CMS. It's amazing how much this still needs to be done. Some CRMs still don't have open APIs. Even if they do, it takes developer time to write the code to do the integration, and those may be resources that a nonprofit doesn't have.
  • All-in-one. Some vendors have products that provide integration by having both the CRM and the CMS together. The trade-off: you don't get best-of-breed tools for both. A second trade-off: all of these are proprietary.
  • Web forms from the CRM vendor. This is integration of a sort. One sets up a page (often a donation or event page) on the CRM vendor's web platform and links to it from one's own pages. Or, one pastes HTML code for a form into one of the pages of the CMS, and when the user clicks "submit", the data actually goes to the CRM vendor.
  • Integration. This is when actual code is written in the CMS (via module or customization) that calls APIs on the CRM side to perform specific actions, such as adding records, syncing records, grabbing data, etc.

All of these strategies take time and resources, but of different kinds. Some take internal staff resources (especially the manual strategy.) Others take developer resources (especially the integration strategy.) Depending on the CRM, some require additional license fees for forms or APIs. So what's the right strategy? That depends on a few factors. First, are you happy with your current CRM and CMS? If so, what specific types of integration do you want to have happen? What are the specific tasks and data types you want to move between the CRM and the CMS? The best way to accomplish those tasks depends primarily on your CRM, although if you are using a proprietary CMS, a hosted CMS service, or an older CMS, you might run into trouble with integration. If you are in the process of reassessing your CRM or CMS or both, now is a very good time to think hard about how you want these two to talk with each other. I know that's one more thing in a long list of considerations (and it's generally more important to think about for the CRM - the CMS, if it is modern, and especially if it is open source, will present few barriers to integration.) I'll have follow-up posts on specific examples of this integration using open source tools (on one end or the other, or both.)
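
As a rough illustration of the "Integration" strategy, here is a hedged Python sketch. FakeCRM and its method names are invented for illustration (no real vendor's API looks like this); the point is the shape of the code: the CMS calls the CRM's API on each form submission to add or sync records.

```python
# Hypothetical sketch of the "Integration" strategy: CMS-side code calls a
# CRM API whenever a web form is submitted. FakeCRM stands in for a real
# vendor API; all names here are invented for illustration.

class FakeCRM:
    def __init__(self):
        self.records = {}  # email -> contact record

    def upsert_contact(self, email, **fields):
        # create the contact if new, otherwise update fields in place
        rec = self.records.setdefault(email, {"email": email, "donations": []})
        rec.update(fields)
        return rec

    def add_donation(self, email, amount):
        self.records[email]["donations"].append(amount)

def handle_form_submission(crm, form):
    """Called by the CMS when a visitor submits a donation form."""
    crm.upsert_contact(form["email"], name=form["name"])
    if form.get("amount"):
        crm.add_donation(form["email"], form["amount"])

crm = FakeCRM()
handle_form_submission(crm, {"name": "Pat", "email": "pat@example.org", "amount": 50})
handle_form_submission(crm, {"name": "Pat Q.", "email": "pat@example.org", "amount": 25})
print(crm.records["pat@example.org"]["donations"])  # [50, 25] - one contact, two gifts
```

Note how this differs from the manual strategy: the second submission syncs to the existing contact rather than creating a duplicate entry someone has to merge later.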


Tidbits

On 09 Jan, 2009 By mpm With 1 Comments

Some stuff from my inbox. (A lot of these are 2008 news, therefore, kinda old. But still interesting to me.)

  • Appirio released their top 10 predictions for cloud computing in 2009. One of the more interesting ones is that "a major SaaS 1.0 company will fail." I kind of wonder about some of the early nonprofit-focused SaaS offerings, and how long they might have to live, given the strength of Salesforce.com.
  • Third Sector New England, a Boston-based nonprofit capacity-building organization, launched a series of "FAQ" videos for nonprofits. Useful stuff.
  • NARAL Pro-Choice America launched an innovative ad campaign. Very neat stuff, and a great use of video and YouTube.


The Dangers of Online Services

On 07 Jan, 2009 By mpm With 1 Comments

This week was a bad week for online blogging services. First, the blogging service JournalSpace, with hundreds of users, just, well, died, because they didn't have a proper backup. Today, the hacking of the blogging service SoapBlox, which was used by many progressive political bloggers, such as Pam's House Blend, became known, and it is currently unclear how many sites have survived, and what will happen to them. These are two fairly small, fairly low-profile services (although SoapBlox is considered an extremely important part of the progressive blogosphere.) They hosted a small percentage of the blogs out there (in comparison to, say, TypePad or Blogger.) But it is, of course, devastating to those who had their blogs there. Lessons to learn:

  • Always have your own backup of your data/content
  • Remember when setting up a website or blog that if you use a service, the data is not in your hands, but in someone else's
  • Always have a disaster recovery plan
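
The first lesson can be made concrete with a small sketch. This is a hypothetical Python example (the paths are placeholders, and a real script would first pull your content down via the service's export feature before archiving it):

```python
# Minimal sketch of lesson #1: keep your own dated backups of hosted
# content. Paths here are placeholders; in practice you'd export your
# blog's content from the service first, then archive that export.

import tarfile
import tempfile
from datetime import date
from pathlib import Path

def backup_directory(content_dir, backup_dir):
    """Archive content_dir into backup_dir/backup-YYYY-MM-DD.tar.gz."""
    backup_dir = Path(backup_dir)
    backup_dir.mkdir(parents=True, exist_ok=True)
    archive = backup_dir / f"backup-{date.today().isoformat()}.tar.gz"
    with tarfile.open(archive, "w:gz") as tar:
        tar.add(content_dir, arcname="content")
    return archive

# demo with a throwaway directory standing in for exported blog content
work = Path(tempfile.mkdtemp())
(work / "posts").mkdir()
(work / "posts" / "hello.html").write_text("<p>my post</p>")
archive = backup_directory(work / "posts", work / "backups")
print(archive.exists())  # True
```

Run something like this on a schedule (cron, say), and keep the archives somewhere other than the service itself; a backup stored on the host that just died is no backup at all.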


Top 10 blog posts of 2008

On 26 Dec, 2008 By mpm With 1 Comments

Here's the top 10 list for 2008:

  1. Remember when 1 MB was alot? I wrote this post back in 2005, and it is the most popular in 2008! That's actually because someone included it in a Wikipedia article (no, it wasn't me.)
  2. Carnival of Nonprofit Consultants on July 27th. I don't know why this rose to the top, but the carnivals are fun to do.
  3. No More Custom CMS. Where I rail against web shops that continue to suggest that people use their own CMS, when it's just not possible for one shop to replicate the robustness, features, security and upgradeability of the open source CMS offerings.
  4. Blackbaud Buys Kintera. The proprietary consolidation of the CRM/donation management system space continues apace.
  5. The Search for Good Web Conferencing. An exploration of options with my own particular requirements in mind.
  6. Google Analytics vs. Sitemeter. Wow, this post is from 2006.
  7. Getting Naked: Being Human and Transparent. This blog entry from 2007 is about being open about one's mistakes. I think it's the word "naked" that does it. It has one of the highest bounce rates of any post on this blog.
  8. What is Cloud Computing? I define it, and explore it a bit.
  9. Linux Desktops? One of my frank and painful posts on the topic.
  10. Cake vs. Symfony. Where I explore these two PHP frameworks.


My Top 16 tools of 2008

On 26 Dec, 2008 By mpm With 6 Comments

These span the range from tools I use every day or every week, to tools I use more occasionally, but depend on. They also span the range of proprietary, SaaS, and open source. They are on this list because I think they are great, because they have undergone a lot of change or development this year, or because they are game-changing.

Open Source Tools

  1. WordPress. I use WP pretty much every day, between my own blogs and helping clients maintain theirs. WP as a blogging tool rocks my world, and although I certainly could move blogging to Drupal, since I seem to be becoming somewhat of a Drupalista, it's just not worth it. WP is clean and easy, and virtually hassle-free. There are lots of really great themes out there, and there just isn't a reason I can find not to use it.
  2. Drupal. I'm somewhat of a latecomer to Drupal. Having been bogged down with my own open source CMS tool before 2005, then having taken a break from development, I missed out on the prime years of Drupal's development. But now, here I am, and I'm impressed. It has become arguably the most popular open source CMS, and is a very able platform for creating all sorts of great web applications.
  3. Xen. I use this every day, although I don't really interact with it much. I am administering and/or responsible for a couple of Virtual Private Servers that use it. Virtualization has really come into its own this year, and will continue to be a force to reckon with. I'm betting that in 2009, many folks will move from shared hosting to VPS servers. There are a lot of good reasons to consider this.
  4. Songbird. Songbird is a brilliant idea: build a music player using the Mozilla framework. Songbird was a buggy mess just a year ago, but with the recent release of 1.0, it's absolutely an application to get to know.
  5. CiviCRM. Oh, what a difference a year or so makes. CiviCRM continues to mature, and is providing an interesting and important new model for nonprofit software development. It is becoming more popular, and is also highly recommended by those who use it. I've been getting to know it this year, and have begun implementing it. I like it more and more.
  6. Freemind. This is an awesome cross-platform mind-mapping tool. I use it to create sitemaps, mostly, but it's also great for brainstorming.
  7. Elgg. Elgg is the open source social network management system. Install it on your own server, control your own data. Don't use Ning, use Elgg. It finally looks like a project which will allow me to explore the strength of that platform is coming around the bend. Stay tuned.
  8. MAMP. Wanna set up an easy development environment on your Macintosh without struggling with Fink or MacPorts? Use MAMP. Easy, fast, robust, and powerful.

Being a pragmatist, I do use proprietary tools, both of the Software-as-a-Service and basic desktop varieties. I use these tools because I haven't found open source alternatives for these functions that work as well, or are as user friendly.

SaaS Tools

  9. last.fm. I love last.fm. I love discovering new music, seeing what people I know are listening to, and learning more about what I listen to over time.
  10. Twitter. This was the year for Twitter. This was the year that nonprofits discovered Twitter, and the year I integrated Twitter into my workflow.
  11. Evernote. I haven't yet become an Evernote devotee, but I might. It's an online note-saving service, with desktop and iPhone clients. It's great to be able to take notes on my iPhone on the fly, and know they are saved and will show up on my desktop when I want them. And it's great to have my notes wherever I go, without bothering to sync my phone.
  12. Intervals. Having tried a variety of project management and time-tracking tools over the years, from open source tools like ProjectPier (used to be ActiveCollab) and GnoTime (abysmal), as well as SaaS tools like Basecamp, I have finally come across what is, for me, the perfect mix of project management, time tracking, and invoicing. It's not cheap, but it works well, and saves me so much time invoicing that it pays for itself several times over every month.

Proprietary Tools

  13. Adobe AIR, and applications. Adobe AIR is an impressive framework for rich internet applications. I use TweetDeck, Twhirl, and the Analytics reporting suite, among others.
  14. Balsamiq. This Adobe AIR application deserves its own entry. (I've been meaning to blog about it for a while.) It's a really great tool for creating very rapid mockups of sites that you are working on. It actually is good enough as a wireframe tool.
  15. Coda. Panic software makes really good stuff. Coda is a great editor for developers. I like it better than TextMate, which I know is another popular editor for developers.
  16. VMware Fusion. Even being the semi-religious Mac and Linux desktop user that I am, every once in a while I am forced to use Windows. This makes it tolerable. There's a nice full-screen view, if I want to really feel the pain. There is also a mode called "Unity" which allows you to run a Windows application in a regular Mac window. It's kinda cool.

So what tools did you come to depend on in 2008?


Can open source software save organizations money?

On 19 Dec, 2008 By mpm With 4 Comments

Next year, given what is likely to be a grim funding year, nonprofit organizations are going to be hunting for ways to save money on technology. There are, of course, arguments that IT budgets should be at least level-funded during slim times, but the reality is that organizations are going to reduce budgets across the board. One question that will inevitably be asked: can free and open source software save organizations money? The answer, of course, is both a solid maybe and a resounding yes. Confusing, huh?

Open source software is both free as in "beer" and free as in "kittens." There are no license fees, but it takes care and feeding. The most important part of the equation is what you are implementing, and whether or not you need to factor in migration costs. Nonprofit organizations that migrated to open source software from proprietary packages with large license fees during relatively fat economic times are reaping the benefits of that change now, and are in good shape to weather the storm. Organizations that haven't been able to do that migration might find those costs prohibitive right now - which is unfortunate. But if you have a migration planned anyway, now is absolutely the time to look at open source software. At this point in the maturity of most open source packages that nonprofits would want to use, the implementation cost is very much in line with that of proprietary software. So you are saving money - no cost to acquire, and no long-term license or maintenance fees. All of the above adds up to that solid maybe: implementing open source software in your organization might save you money, depending on what you are implementing and what the migration costs are.

Where does the resounding yes come from? This, if any, is the time for organizations to reject the standard "every organization for themselves" mentality of software acquisition and development. Find a solid open source package (CiviCRM, for instance) and join with other organizations to fund extensions that make it what you need. Find five organizations that do similar work, and collaborate to build an open source application that can work for your part of the sector. Release it so a community can develop around it, and make it modular so that it can be easily extended. Give it APIs so you can hook other software to it. Build it with open standards so the data is readable in perpetuity. Doing this will mean you get far more application for the money you spend. Of course, it all takes effort and work. But it's worth it - and the entire community benefits from an enriched software ecosystem. It also ends up not just being about saving money. It ends up being about building community - and community will be an incredibly important asset in the coming years. There is an appropriate popular culture reference: "live together, die alone."
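The "solid maybe" above is really a total-cost-of-ownership comparison. Here is a minimal sketch of that arithmetic; every figure is a hypothetical placeholder, not a real quote from any vendor:

```python
# Back-of-the-envelope total cost of ownership comparison.
# All dollar figures below are hypothetical - plug in real quotes.

def tco(acquisition, annual_fees, years):
    """Total cost over a time horizon: up-front cost plus recurring fees."""
    return acquisition + annual_fees * years

years = 5
# Proprietary package: license purchase plus yearly maintenance fees.
proprietary = tco(acquisition=10_000, annual_fees=2_500, years=years)
# Open source package: no license fee, but a one-time migration
# cost and (optionally) paid support.
open_source = tco(acquisition=6_000, annual_fees=1_000, years=years)

print(f"Proprietary over {years} years: ${proprietary:,}")
print(f"Open source over {years} years:  ${open_source:,}")
print(f"Difference: ${proprietary - open_source:,}")
```

The point of the sketch is the structure, not the numbers: if the migration cost (the open source "acquisition" line) is large enough, the maybe tips the other way.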


The Power of Open

On 15 Dec, 2008 By mpm With 3 Comments

[Screenshot: Songbird screen]

I've known about Songbird for a long time. It's a cross-platform music player built on the Mozilla framework. I thought it was a brilliant idea years ago, but it was a buggy mess the last time I tried it (about a year ago). However, Songbird has emerged, like many open source projects do, as a mature, stable, and, in Songbird's case, truly awesome application, because of the incredible extensibility of the Mozilla framework (and the talent of the Songbird developer community). I've only been running Songbird for about 20 minutes, and already it's linked with my last.fm account, is showing me a picture search based on the artist I'm playing, and is listing all of the concerts happening in the Bay Area by artists in my library. I can read reviews, browse videos, and read the lyrics of the song playing. It's happily notifying Growl when new songs play. This qualifies as a killer app, and it will give iTunes a run for its money. I don't really have a good reason to use iTunes anymore. Between the open standards that allow Songbird to grab data from all sorts of places, and the open architecture of Mozilla, which allows hundreds or thousands of people to write their own cool plug-ins that we all benefit from, this really does show the power of open. Next question: can we get the nonprofit version of the killer open source and open platform app?



Digging deeper into the portable social graph

On 06 Dec, 2008 By mpm With 2 Comments

Facebook Connect was announced a few days ago, and, of course, it's the talk of the Web 2.0 world. Beth Kanter, as always, has a nice overview of what it is and what it might mean. Google Friend Connect has been around for a few months, but they just opened it up to everyone last week. What do these two toolsets mean? Are they truly open, and based on open standards?

Just a quick definition: the "social graph" is, basically, your data about who you are and who is connected to you - who your friends are. A portable social graph would be one that you can take with you wherever you go, so that the friends connected with you on one network are also connected with you on another. It's the holy grail of social network connectivity: you are connected to who you are connected to, no matter what site you are on.

Google Friend Connect is a toolset based on three standards - OpenID, OAuth and OpenSocial - two of which are open, and one of which (OpenSocial) could probably be considered an open standard, though it originated with Google. Any social network that can use these three standards can be drawn into the open social network web using Google Friend Connect, and any user on any of the social networks that use these standards can connect with their friends on the others. Facebook Connect, on the other hand, is a proprietary process that competes with OpenID, and it provides only a two-way communication between other sites and Facebook - it's not at all open. And if you are not on Facebook, the fact that other sites use Facebook Connect won't matter to you. (For instance, it won't help connect LinkedIn with MySpace.) Facebook Connect is not the portable social graph we've all been hoping for - Google Friend Connect is a bit closer to it.

Both Google and Facebook are interested in being the repository for your credential and social graph data. However, the fact that Google uses the open standard OpenID means that you can actually control where that data lives - and that is not the case for Facebook. What is most annoying to me is that Facebook Connect is proprietary, and it competes with an open standard, OpenID. They could have just as easily implemented the open standards - but they chose to go in a different direction. For most of the social networks, the walls of the gardens are coming tumbling down; Facebook is basically just enlarging its walled garden. What does this mean for most nonprofit organizations? Not a whole lot. This is going to take a long time to shake out, and only the most Web 2.0-savvy nonprofits are going to be doing technology projects that involve implementing either of these new toolsets.


We want video!

On 05 Dec, 2008 By mpm With 2 Comments

YouTube is everywhere - videos are a common part of websites, and almost everyone has an internet connection with high enough bandwidth to play video. This means that a lot of nonprofits are interested in having video on their sites. So what does it take, and what should you consider as you embark on adding video to your site?

First, it is almost always a mistake to upload a video to your website without thinking about the ramifications, both in terms of bandwidth and performance. If you have a standard hosting account, or even a VPS (Virtual Private Server), do some back-of-the-envelope calculations to make sure you won't end up with sticker shock at the end of the month. Video is very bandwidth-intensive. It is not at all difficult to overshoot the bandwidth limitations of your hosting account with one short video on your home page. A client of mine put a short video on their home page after election day, and we had to take it down a week later, or else they would have had to start paying for extra bandwidth. Take the average traffic for the page you'll add the video to, and multiply by the size of the video. For instance, if you have a 3 MB video, and you get 1,000 visits per day on that page, that's potentially 3,000 MB (3 GB) of bandwidth per day (of course, most people won't play through the entire video, but that's the place to start) - on the order of 90 GB over a month, which will exceed the bandwidth limits of many virtual hosting plans. In terms of performance, lots of people streaming a video from your website can bring a webserver to its knees. If that video is more popular than you expected, you may end up paying for it, both literally and figuratively.

What about putting it somewhere else? YouTube is the easy answer. Google pays the hosting costs, you get easily embeddable video that can go viral, and you can drive traffic from YouTube to your site. But what if it's not a public video (perhaps you want to provide video for your members only), or you want to stream live, or use a different format than Flash? There are a number of services you can pay for; StreamGuys and Limelight Networks are two examples of companies that provide that sort of service. Putting video on your website takes both strategic thinking (why are we doing this? what are the goals?) and tactical, technical thinking (what's the best way to get this video to the eyeballs we want to see it?)
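That back-of-the-envelope estimate can be sketched out directly. The numbers below are the illustrative ones from the example above, not real traffic data, and the worst-case assumption (every visitor downloads the whole file) is deliberate:

```python
# Rough monthly bandwidth estimate for a self-hosted video.
# Pessimistically assumes every visitor downloads the entire file.

video_size_mb = 3        # size of the video file
visits_per_day = 1_000   # daily visits to the page hosting it
days_per_month = 30

daily_gb = video_size_mb * visits_per_day / 1_000
monthly_gb = daily_gb * days_per_month

print(f"~{daily_gb:.0f} GB/day, ~{monthly_gb:.0f} GB/month")
# A hosting plan capped at, say, 50 GB/month would be blown
# through in a couple of weeks at this rate.
```

Swap in your own page traffic and file size; if the monthly figure approaches your plan's cap, host the video elsewhere.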



How's that donor database of yours?

On 02 Dec, 2008 By mpm

In general, although I am sometimes asked, I tend to avoid assisting clients with choosing a donor database package - mostly because, although I actually know the field pretty well, I know it at the 10,000-foot level, rather than the 50-foot level that clients really need. And I know there are plenty of folks out there who know the field really well at 50 feet, and can step in with the best advice. As a 10,000-footer, NTEN's new Donor Management System Survey is of keen interest. There is, of necessity, a lot of overlap between CRM systems and donor management systems. Many of the CRMs also show up here, although there are quite a number of packages that did not show up in the earlier survey.

In some ways, it is astonishing how many different donor management packages there are. In most ways, however, this is far from a surprise - donor management is a primary way that money gets funneled into nonprofits, and, unsurprisingly, organizations often spend significant dollars on their donor management packages. By far the most popular DMS of the ones surveyed was ... you guessed it ... Blackbaud's Raiser's Edge. 18% of users surveyed use that one, which also accounted for 35.5% of use in very large organizations. Others I think about: CiviCRM had 4.8%, Organizer's Database 3%, Salesforce 2.6%, Democracy in Action 0.6% and MPower 0.4%. I also have to wonder (shudder) how many home-grown Access and FileMaker databases fall into the "Other" category of the survey - almost 20% of the total.

So how did people like these? They ranked the percentage of folks who would recommend a package. A three-way tie for first included two proprietary packages I'd never heard of - NEON CRM and DonorPro - along with Organizer's Database, the desktop open source DMS. Fourth (since there was a three-way tie) was CiviCRM. The bottom four included three properties of Blackbaud: Raiser's Edge, eTapestry, and Kintera Sphere, which was in dead last place. (iMIS rounded out the bottom four.) Salesforce was somewhere in the middle (ranked 9th). What's interesting is that they did a size-of-organization analysis - breaking down recommendations by size of organization. Raiser's Edge, for instance, did much better among large and very large organizations, and very poorly in small orgs (which probably shouldn't be using it anyway). The reverse was true of Salesforce. (The numbers aren't always quite large enough for these to be solid, but it's a great indication of what's going on.)

What can we say about the open source packages? There are only three in this race: CiviCRM (web), Organizer's Database (desktop) and MPower Open (client/server). CiviCRM and ODB were at the top of the pack in terms of popularity, recommendations and grading; MPower had very few respondents who used it, and it wasn't included in the ones that were ranked. But it's safe to say that these are good contenders, and did well.

Last but not least, the grading. Who's going to get into med school? DonorPro and NEON CRM are at the top of the class, and will, I'm sure, get into Harvard Med. DonorPerfect, CiviCRM and Antharia's On Deposit have solid A's, and will get in for sure. There is a large group of packages - Salesforce, ODB, GiftWorks, and others - that will probably make it, but they might have to settle for second-tier schools. Raiser's Edge, eTapestry and iMIS are going to need a special tutoring program if they hope to make it. And Kintera Sphere, I think, is going to open a car repair shop.


Drupal and PostgreSQL

On 21 Nov, 2008 By mpm

A while ago, I joined a bunch of groups at groups.drupal.org, thinking I'd pick up some interesting ideas and meet some folks who were doing cool stuff with Drupal. One of the groups I joined (along with "Drupal for Good" and "Drupalchix") was the PostgreSQL group. Yesterday, this post showed up in my RSS feed: a suggestion to remove PostgreSQL support from the Drupal core. I was always aware that Drupal supported PostgreSQL, though I didn't really have any plans to use it, and there are varied opinions as to its usefulness (which I beg to differ on). But as a long-time lover of PostgreSQL, I couldn't let this drop. And I'd been looking for a good solid project to get me going in Drupal, so it looks like I found it. So I've adopted it.

But it turns out that with Drupal 7 (the development branch) it's virtually impossible to install Drupal on PostgreSQL, and even though I did wrangle an install (all of the right tables seem to show up in the database), it doesn't actually work, and I can't yet figure out why. I don't yet really grok the structure of Drupal, so it feels like sorting through spaghetti right now. There are several core modules with PostgreSQL problems in Drupal 6, so I might actually go back and work on those first, before I think about tackling what's wrong with install.php and PostgreSQL.


Bleary-eyed and geared up

On 19 Nov, 2008 By mpm

I don't usually title my tech blog entries with quite that sort of title, but that's how I feel after spending three days with one of the most fabulous groups of people I have had the honor of spending time with in recent memory. I was at the Nonprofit Software Development Summit, an event full of great sessions, meeting neat people of all sorts, and having lots of geeky fun. It was a great combination of really detailed tech learning (I learned a really cool trick using jQuery to generate rounded corners, which is generally not an easy thing to achieve) and big-picture thinking. I got to learn a ton, and contribute a bit. Sessions I went to included:

There are lots of great notes there if you missed those sessions, and I'm looking forward to reading the notes from other sessions I wanted to go to, but missed. Now, I'm just going to sleep.


The social network commitment

On 16 Nov, 2008 By mpm

Getting involved in a social network, whether it's something like Facebook or MySpace, or a content-centered social network like Flickr or Delicious (I'm starting to get used to writing that without the dots), is pretty easy. But there are SO many, and they all have their pros and cons. What I have learned, though, is that a social network is only as good as something you have absolutely no control over: how many of your real friends and colleagues use it. Sure, you can join a social network and "friend up" a bunch of people you don't know. Perhaps you'd meet some cool people. But you'd primarily be wasting lots of time. And if you're a nonprofit trying social networks out to figure out how to leverage your modest resources for maximal impact, it's really important to know where your constituents are. Over the last two years, I've joined more social networks than I can count (even after I vowed - and only a couple of times violated my vow - to only join social networks based on open standards, like OpenID and ODD (Open Data Definition)). The content-focused networks, like Delicious, SlideShare and Flickr, I generally use as a primarily one-way method of publishing specific kinds of content to people I know (and, of course, people I don't know, since it's public). I've learned that there are only a few that I really need to bother with:

  • Facebook: I consider it a watershed moment that my partner joined Facebook last week. The majority of my Facebook friends are people I've actually met in person, and a surprising percentage of my actual, real, in-person friends are on Facebook (considering that I am a relatively old fart of the Facebook set at 49). I'm not bothering with MySpace, Orkut, etc. If, perchance, there were a wholesale migration of my friends to a new platform, I'd certainly move, but it makes no sense to join a social network that might be more open, for instance, if no one I know is there.
  • del.icio.us (sorry, I couldn't help it): I actually barely use the social networking capacity of Delicious. I use it as my personal repository of sites I want to keep tabs on. I know it's public, and it also serves to share interesting stuff I think is worth looking at.
  • Flickr: I also don't use the social network capacity of Flickr much, except to keep track of the photos of a few real friends and family.
  • Twitter: The nonprofit technology community has chosen Twitter as its microblogging service, so even though I use ping.fm to send status updates to Plurk, identi.ca, Rejaw, and some others, I never actually go to those sites. Very few people I care about are there (and they tweet too, anyway).
  • SlideShare: Again, a service I hardly use for social networking - I use it to make public the presentations that I've done.
  • LinkedIn: The professional, serious network. I hardly use it, but I know it's there, and it can be useful sometimes.
  • Plaxo: Once just my address book backup, it seems to have become a social network of its own. I only agree to be friends with people on Plaxo who are actually already in my address book (or I know should be). That keeps the address book more likely to be correct. I don't want or need Plaxo to be anything else, thankyouverymuch.
  • FriendFeed: The compendium, with comments and likes. It's great that I can follow all of the content (blogs, tweets, Flickr photos, etc.) of the people I want, all in one place.

An oddball one:

  • Seesmic: I am completely conflicted about Seesmic. For those of you who don't know it - Seesmic is a video conversation social network. I've had some great conversations with people there (including Deepak Chopra, who seems not to post much anymore). It's fun, I love the idea, and I think it has the potential to be very powerful. But I have to say that 85-90% of the conversations on Seesmic feel, well, inane. There are some great exceptions, like a recent conversation about electric cars. But then, in the middle of an interesting conversation, some guy pretending to be a robot, or someone else, will post something completely inane, and it devolves from there. Of course, some large percentage of tweets are inane as well, but there isn't the same overhead. It takes me half a second to scan the "I'm cleaning my garage" tweet (and another second to scan the responses, if any), but do I really want to spend 5 minutes hearing about it? And spend the time playing the responses to it? Not hardly. Also, unlike the others, there really isn't a nonprofit technology presence (who has the time?). So conversations I care about aren't really going to happen there until that changes.


What is "organic" software?

On 15 Nov, 2008 By mpm

I was perusing the program for a local "green" event when I noticed a full-page advertisement for Firefox, saying that its software was "100% organic." I kind of chuckled, and thought: what does that really mean? For Firefox, it means "open source, community-powered." And I realized that they had an interesting point. In my mind, it harks back to the arguments Yochai Benkler made in his book "The Wealth of Networks": that an ecosystem full of open source, community-powered software is, in a sense, more sustainable, and promotes more, not less, innovation than the proprietary software ecosystem. So now I think I agree with the Mozilla Foundation that as good a metaphor as any is that open source is to software what "organic" (and probably "fair trade") is to food. Too bad the metaphor doesn't go both ways, because then organic and fair trade food would be free, too. And just as both of those labels are complex and not entirely easy to nail down with food, so it is with software. But I think it works.



Drupal Theming, and other projects

On 13 Nov, 2008 By mpm With 2 Comments

I'm learning Drupal bit by bit - one of the first tasks was to learn how to make a new theme. It's one of those things that seems fairly straightforward ... until you hit a snag. And then it's opaque. One thing I learned is that it is incredibly sensitive to typos. One space accidentally inserted between the "<?" and the "php" led to a completely blank page. Ah well. I'm certainly learning what mistakes can lead to what kinds of issues, which is good. Eventually that becomes second nature. In any event, by the end of an hour or so of hacking, I'd turned a template that I found online at Open Web Design into a Drupal theme. I felt accomplished! I'm going to do a few more, and see how sophisticated I can get with it. One thing I ran into (and haven't been able to solve yet) is that it's not easy to build navigation that requires more than the standard <ul><li> tags. Adding <span> tags, for instance (which makes possible some more interesting-looking navigation buttons), seems, at least at first, far from trivial. I'm making a list of little(ish) projects that I want to do - sort of like problems I think I want to know how to solve.

  • Drupal and Google Docs single sign-on. There is already an SSO module for Drupal 5.x, and someone submitted a patch for it, but it's still up for review. I'd also have to cough up $50/year to get a Google account that has the SSO API, but it might be worth it.
  • Drupal sidebar connecting with the NPR API - perhaps to provide a targeted news stream?
  • Doing a google map mashup of data in Drupal
  • Working with getting flickr photostreams to show in Drupal

I'm still looking for a good project to try out in Cake. Unfortunately, the module Drake, which is meant to be a bridge allowing you to run Cake applications within Drupal, seems moribund. There is only a development snapshot for the 5.x branch, and no one seems to be picking it up for 6.x. Sigh. There is, for sure, another whole blog entry to be written about Drupal modules.
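Incidentally, the blank-page typo from the theming adventure above ("<? php" instead of "<?php") is easy to catch mechanically. A minimal sketch - my own illustration, not an actual Drupal or PHP tool - that scans template source for a stray space inside the PHP open tag:

```python
import re

# Match a PHP open tag with stray whitespace ("<? php" instead of "<?php"),
# the kind of typo that silently produces a blank page in a Drupal theme.
BAD_OPEN_TAG = re.compile(r"<\?\s+php")

def find_bad_open_tags(source: str) -> list:
    """Return the character offsets of any malformed open tags."""
    return [m.start() for m in BAD_OPEN_TAG.finditer(source)]

print(find_bad_open_tags("<?php print $content; ?>"))   # -> []
print(find_bad_open_tags("<? php print $content; ?>"))  # -> [0]
```

Running something like this over a theme directory before reloading the page beats staring at a blank screen wondering which file broke.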


Going out on a limb

On 06 Nov, 2008 By mpm With 2 Comments

I'm going to go out on a really thin limb here - feel free to saw it off in the comments. :-) If you haven't been to change.gov yet, you need to go. Now. I'll be here when you come back.

There is little question that Obama was Presidential Candidate 2.0. And it's becoming increasingly clear that he'll be President 2.0. What made this possible? Of course, without his intelligence and his desire to be involving and inclusive, it wouldn't have happened. But there is no question that there is a technical aspect to what made this possible. New technologies - the web, Web 2.0 services like Twitter, Flickr and Facebook, text messaging - all of these made this possible, plus some amazing underlying technical infrastructure. It engaged voters (largely young voters, but others as well). It allowed people to get involved, and it helped motivate them.

So, to go even deeper, what made all of this possible? Well, Web 2.0 depends largely on two things: open standards and open source software. It is my argument that without these two things, Obama would not have been able to harness technology the way he did. He might have won anyway, but I think these two factors made it a lot easier. And I think they will be key to providing Government 2.0, which hopefully will be as transparent and open in actuality as it is technically. Open source software and open standards are the foundations of Web 2.0. Open standards are now becoming de rigueur for application developers, and even proprietary vendors are adopting longstanding ideas and methods from free and open source software. I think the next 4-8 years are going to prove Yochai Benkler right.


Cake vs. Symfony

On 05 Nov, 2008 By mpm With 10 Comments

In my new explorations of PHP web application development, it seemed a good idea to look at both CakePHP and Symfony. Both are PHP's answer to Ruby on Rails, with approaches that are similar in some ways and different in others. I set up both on my laptop and tried out some really simple app development. In Cake, the database build is separate from the application building (you do it yourself), whereas in Symfony, you use Symfony to build the database from schema files written in YAML, and then build forms and such using the schema as a foundation. They both use the MVC pattern, and both use object-oriented PHP, which is great. I got a lot further with Cake in one evening of playing with both than I did with Symfony. At this point, I really prefer Cake - it feels like it jibes with my own coding sensibilities better. I also don't like the overhead of learning YAML. I can imagine, though, that the Symfony approach can be powerful. Looking at Ohloh, Cake is more popular than Symfony (on Ohloh, at least - who knows about in general), but Symfony has a lot more developers (81 vs. 17). They both have good documentation and active communities. For now, unless something strange happens, I'll settle on Cake - although I won't be spending too much time on it, since I'm working hard to grok Drupal. But perhaps a cool project will manifest, and I can use it. Update: I learned that Yahoo and Delicious have a huge investment in Symfony (which, I guess, might be why it has so many more developers).


Tidbits

On 05 Nov, 2008 By mpm

  • One of the underlying stories of the 2008 election victory of Barack Obama is the really intelligent use of technology, in a way that will permanently change how campaigns are run in the future.
  • Open source hardware: can it be done? I hope so, and I'm looking forward to seeing its progress.
  • There have been a number of possible "Exchange killers." Open-Xchange just got a bunch of $. Perhaps it's the one?
  • The Free Software Foundation revised the GNU Free Documentation License (GNU FDL) to allow public wikis to relicense their content (by August 1, 2009) under Creative Commons BY-SA 3.0. Apparently, they were asked by the Wikimedia Foundation to do this. The CC BY-SA is the most FDL-like of the CC licenses.
  • Firefox 3.1 adds a very cool tab preview function. Woo hoo!


It's the economy ...

On 03 Nov, 2008 By mpm With 4 Comments

As I said last week, today is my day to host the Carnival of Nonprofit Consultants. I chose this as my question of the week: "Is your work changing because of the economy? How? What adjustments are you making?"

Somewhat tangentially related:

I have been doing a lot of thinking myself about the economic meltdown, and what it means for me personally as a consultant. I've talked a little before about some changes I'm making, both out of personal interest and because of what I feel are changes in the wind in terms of the kinds of new priorities that might be emerging.



Carnival of Nonprofit Consultants ... Here! Monday!

On 29 Oct, 2008 By mpm

Monday (the day before the election) I'm hosting the Carnival of Nonprofit Consultants. My focus: is the economy changing the way you work, or the way you think about your work? In what ways? If you'd like to submit a post, do so by midnight Saturday. Go to BlogCarnival.com and submit your post using the form there, or send an email to npc.carnival AT yahoo DOT com with your name, your blog's name, and the URL of the post (not your blog homepage).


Firefox add-ons to love

On 23 Oct, 2008 By mpm With 4 Comments

I'm in love with Firefox - and have been since 3.0, when a few of the things that plagued it finally got ironed out. More and more websites are designed not only with Firefox in mind, but sometimes even primarily for Firefox. Here's a short list of the Firefox add-ons that I use all the time:

  • Firebug - a great tool for HTML/CSS/JS development
  • Web Developer - a nice toolbar, also useful for development
  • Google add-ons, including the toolbar and the Google Notebook add-on (although I'm beginning to use Evernote more, because it keeps a copy of my notes both on my desktop and online)
  • Feedly - a really nice tool for making your feeds more readable. It syncs with Google Reader, so that when you read something in Feedly, it's marked as read in Google Reader.
  • Fire.fm - I'm in love with both Pandora and Last.fm. Fire.fm provides a nice toolbar you can play stations from. It's a nice integration.
  • Delicious Bookmarks - the official plug-in from Yahoo. It's sweet: a button to easily tag the pages you are visiting, and a nice sidebar to see your bookmarks from.
  • I also use varied Greasemonkey scripts to make things more interesting.

There are a gazillion (well, a lot of) add-ons for Firefox (and for other Mozilla tools as well). What do you use?


OpenOffice.org goes Aqua!

On 21 Oct, 2008 By mpm With 1 Comment

As many of you know, I have been using OpenOffice.org, the free and open source office suite, since before it was OpenOffice.org - back when it was StarOffice. That was a long time ago. So I've seen it develop and change (and helped a tiny, tiny little bit along the way by submitting bugs). The Mac has long been the poor stepchild when it comes to OpenOffice.org. On Windows and Linux, there were native versions that were easy to install and use. On OS X, you had to either run the most recent version of OpenOffice.org under X Windows, which most Apple users don't use (and which lacked the nice Aqua window dressing), or use NeoOffice, which was steps behind OOo and had some serious memory-leak problems (it got better over time, but it was still pretty unstable the last time I used it). But while I was busy doing other things, like moving, OpenOffice.org released version 3.0, and with it, native Aqua versions for Intel and PowerPC Macs. Can you see me doing a happy dance? OpenOffice.org has been a great alternative to Microsoft Office for Windows users (and really the only full-featured office suite for Linux users). But now Mac users don't need to sacrifice to get the latest OOo goodies.



New community focus at MPower Open?

On 21 Oct, 2008 By mpm

For a while, I've been watching MPower Open, the (not so) newly open-sourced (but Windows-based) fundraising package. In general, I've been impressed by its feature set, and by the choice to go open source. MPower has traditionally been used primarily by faith-based organizations (by the way, that's not a small niche, even though it is one somewhat neglected by the nptech community), and they have been quite committed to expanding from that niche. One of my prime concerns once MPower went open source has been: how are they going to build a community of users and developers? They have a tough road ahead of them. First, it's a hard road for company-based open source projects, though luckily a few have done some road-building ahead of them (see below). Second, Windows developers (and savvy users), for the most part, aren't used to open source communities (DotNetNuke is one exception, and there are other projects with Windows ports and components), and open source developers and users primarily work on Linux (and Mac) platforms, so building a critical mass of interested developers and users is going to take work (it takes work anyway, but it will take more).

At present, on their SourceForge page, there are very, very few forum posts, no bugs reported, and many days with few if any downloads. This is not the sign of a healthy open source community. But perhaps there is change on the horizon. MPower announced today that they are hiring a new VP of community: Matt McCabe, who is very familiar with the nonprofit fundraising space, having spent time as a consultant at Convio. I had a great chat with him yesterday, and was impressed by his background, his knowledge of the sector, and his commitment to growing the community around the MPower Open product. He has a lot less knowledge of open source communities and how they work, so I have some homework for him:

  • If you only read one blog, read Matt Asay's The Open Road. Matt Asay is a key member of the company behind Alfresco, an open source Content/Document Management system.
  • Have a chat with the people at SugarCRM - both the company, and developers/partners. They have built a fascinating ecosystem around what is basically a commercial product (with an open source version.)
  • Have a sit down with some of your current partners, including the engineers behind the managers. Find out what they want and need.

Need more homework? I know a consultant you can hire ;-) In any event, I'm quite pleased to hear that they have moved in this direction, and I am really looking forward to seeing what comes of it. If they can really move this forward, it would be fabulous to have feature-rich open source CRM options, with healthy and vibrant user and developer communities, in both the web-based and client/server spaces.


Drupal and WYSIWYG editors

On 16 Oct, 2008 By mpm With 3 Comments

I think that if I had to pick only one thing that would help people understand the character of Drupal, it would be the WYSIWYG editor that comes standard with an out-of-the-box Drupal installation. That would be NONE. There is no standard WYSIWYG editor that comes with Drupal. You have to figure out how to find one, and install and configure it yourself. So if you want to start adding content to your new site, and you need a little formatting, or a picture, etc., well, unless you know a bit of HTML, you are S.O.L.

On the other hand, from my perspective, this is actually a really good thing (can you tell I'm becoming a Drupalista?). There are several editors to choose from, and they differ both in how difficult they are to get installed and working, and in features. Want something barebones? There's one for you. Want something with all of the bells and whistles? There's one for you, too. There is a great review of five of the major ones. I've been getting to know a few of them (and, yes, they can be a pain to install, and they generally depend on other libraries that you have to install as well). I don't have a favorite yet, but I'm thinking I don't need one - just to know which ones are well-maintained, and how their feature sets differ. Then I can choose the one that makes the most sense.



Nuggets of news from the open source world

On 09 Oct, 2008 By mpm

  • This is old-ish news, but the acquisition of the companies behind open source software by big behemoths continues: Cisco has acquired Jabber.
  • Matt Asay makes some interesting points about the fact that proprietary vendors spend time and effort protecting their investments in their proprietary tools, rather than focusing effort on looking toward the future. He says: "Red Hat and other open-source companies, in other words, are focused on the future, because that's what their model requires in order to earn renewals from customers. The proprietary model is more about 'build once, charge everywhere...and as long as you can.' It's a great model for the vendor, when it works, but it encourages stasis in markets and silly lawsuits designed to hoard, not grow customer value." At this point, proprietary vendors in our nonprofit neck of the woods aren't spending time litigating (thankfully), but I'd argue that for a while, at least, stasis in markets was most definitely encouraged. Things might be shifting, though, with both open source and SaaS as catalysts for change.
  • Some folks think that the more users an open source project has, the better, and the healthier the ecosystem. I agree, and I think that bodes well for us as more and more open source software is adopted in the sector (like CiviCRM, for example.)
  • It doesn't have to be SaaS vs. open source; it can be SaaS and open source. SugarCRM is moving forward in that direction, which is cool. Too bad CivicSpace OnDemand is dead - it could have been an avenue for CiviCRM.



The joy of Drupal (and other tales)

On 08 Oct, 2008 By mpm With 2 Comments

I've been working with Drupal a fair bit over the last few weeks, with the ultimate goal of really being able to use it to create sites. I converted my (very simple) consulting site to Drupal, without any bells and whistles. I'm now working on a site that needs some bells and whistles, like translated pages and a WYSIWYG editor (ah, the WYSIWYG editor situation in Drupal is going to get its own blog entry, I'm sure).

My next step is to try to create a simple theme (so I understand how theming works), and write a module (so I understand how modules work). There are still lots of things that are mysteries to me, but perhaps I'll learn more at Bay Area Drupal Camp this weekend.

I've also been digging a lot into the new(ish) social network software Elgg, and beginning to develop a members-only site with it for a client. In my estimation, it's amazingly promising as a platform for interesting private sites. And, since it has OpenID (which seems to have bugs, though), OpenDD, and OpenSocial, it's not such a bad idea for public sites either (although I'd still not suggest that nonprofits spend the time to do this). Maybe someone will use it to create the Facebook killer (I kind of doubt it, but I can hope, can't I?)

As well, I'm still honing my WordPress skills, mostly in the realm of dealing with themes, moving the darn things around, and upgrading from ancient versions. I've done some theme hacking, but haven't yet written a plugin (I can't think of one to write that hasn't been written yet).

And, on top of all of that, I'm re-writing in PHP some core functions of a web-database/CMS framework I wrote in Perl a gazillion years ago (and which is still in use for an application called EASE). That's been fun. What's also fun, in retrospect, is that the framework (the erstwhile Xina) was written using the MVC architectural pattern before I even knew the pattern existed!
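That pattern - a model holding the data, a view rendering it, and a controller mediating between the two - can be sketched in a few lines. (This is a toy illustration with made-up names, not code from Xina or any actual framework.)

```python
# Minimal model-view-controller sketch: the model owns the data,
# the view knows how to render it, the controller wires them together.

class Model:
    def __init__(self):
        self.items = []

    def add(self, item):
        self.items.append(item)


class View:
    @staticmethod
    def render(items):
        # Render each item as a simple bulleted line.
        return "\n".join(f"- {item}" for item in items)


class Controller:
    def __init__(self, model, view):
        self.model = model
        self.view = view

    def add_item(self, item):
        self.model.add(item)

    def show(self):
        return self.view.render(self.model.items)


controller = Controller(Model(), View())
controller.add_item("first post")
print(controller.show())  # -> - first post
```

The point of the split is that the rendering (View) can change without touching the data logic (Model), and vice versa - which is exactly why the pattern tends to emerge on its own in web frameworks.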

It feels like I'm beginning to focus on a core set of tools and technologies (PHP, Drupal, WP, Elgg), and that in a few months, I'll be speaking PHP fluently, like I spoke Perl a while back. And I'm looking forward to getting to work on the kinds of projects that I've been watching as a spectator over the last year.


Build vs. "Buy"

On 08 Oct, 2008 By mpm With 1 Comment

I keep being surprised by how frequently I hear clients tell me that a vendor has suggested they "build them a CMS," or by proposals from vendors that include custom building a CMS. I hear people suggesting building their own social networking website. I even occasionally still hear tell of organizations who want custom CRMs.

The web software landscape has changed dramatically over the years. Five years ago, it was full of custom-built systems of all sorts - and the "build vs. buy" decision was, I think, more difficult, because the available software to buy was fairly cruddy. (And, for the purposes of this post, I'm using "buy" exceedingly loosely - including purchasing proprietary software, installing open source, or using SaaS.) But the landscape is different now, and I think that, in some senses, the "build vs. buy" decision is much more straightforward. First, the available software, whether it be open source, SaaS, or proprietary, is much better all around. New types of software are being developed all the time (like, for instance, the new crop of "Social Network Management Systems," both open source and SaaS, like Ning). In addition, the increasing openness of software, whether it be open source or open platforms like Salesforce.com, means that customizing software to your needs, or integrating different pieces, is much more straightforward - it's a lot easier to create exactly what you need by integration or customization than by building from scratch.

This is not to say that there is no role for custom-built applications. I'm in the process of working with two organizations to create just that. But they are both for quite highly specialized functions. And I've also been involved in projects to create interesting and somewhat customized web functionality - but those are being done by adding custom modules to an open source CMS.
From my perspective, exhaust all of the "buy" options - open source, proprietary, or SaaS out of the box; customized open source or SaaS; integration of already existing components; or building modules on top of open source tools - before you take on building something new from the ground up. You'll save money and time, and you'll be able to take advantage of an upgrade path as web software changes and improves, meaning you won't have to build whole systems over again.


Next up ...

On 02 Oct, 2008 By mpm With 5 Comments

I used to spend most of most days hacking (mostly Perl) code. It had its ups and downs, although in retrospect, the downs weren't really about coding. I haven't done daily coding now for about 3 years, and I'm missing it, terribly. So ... I'm going back to it, slowly but surely. I also have to admit that my gut tells me that in the coming economic landscape, going back to using my coding skills will likely increase my chances of keeping myself in iPhones, BeagleBoards, and microbrews, as well as the more necessary, but boring, things like keeping a roof over my head. Strategic planning is already somewhat of a hard sell for organizations. Methinks it's only going to get harder as grants and donations start to dry up.

In my last post, I was talking about platform choice. To some extent, I can appreciate the argument that Python is a better language than PHP (just like, in 1999, Perl was a better language than PHP was at the time). However, PHP is the basis for Drupal, inarguably the most popular open source CMS, and WordPress, inarguably the most popular blogging platform. It's also the basis for Joomla, a CMS I appreciate. There are also some very cool PHP development frameworks, like Cake and Symfony, that I'm excited to explore. And it's the basis for CiviCRM, a project I'd love to be able to contribute code to. I'm psyched to learn jQuery, too (OK, that's not PHP, but I figured I'd stick it in there). And I don't have to learn a new language (I've done a fair bit of PHP some time ago, and it's not so unlike Perl). So PHP it is, starting with Drupal.

My first steps are to find some projects to help out with, volunteer for, etc., and take it from there. Maybe start doing some work with CiviCRM. It's such a different landscape than it was even 3 years ago. But it's a landscape that presents all kinds of amazing possibilities for creating applications that we couldn't even dream about a few years ago.


My tool is better than your tool...

On 01 Oct, 2008 By mpm

Over the past year and a half, I've been filling a definitely different role with nonprofit organizations than I have most times in the past. I've been an intermediary, rather than an implementor. In this role, it has been my task to provide advice for organizations around technology choice and vendor selection. Many times, I narrow down the technology options as a part of the RFP process. I do this based on my knowledge of the options out there, my own opinions about them, and, most importantly, the feature match.

For many projects, a wide range of options are possible, and in talking with vendors who specialize in one toolset or another, I've been intrigued by the ways in which vendors talk about their chosen tools. For some projects, there is no question that one tool may be better than another. But for a lot of projects, what's far more important than the tool is the approach of the vendor/developer, and the quality of the work they produce.

And some things surprise me. I am actually still surprised at how many small vendors are still selling their custom CMS. Having written and maintained my own for a few years, I know how hard that investment is to let go of. But in terms of long-term sustainability, from my perspective, picking one of the well-developed open source CMSs and running with it can't be beat. There will be an initial investment of time, but the later time savings, and the added opportunities, will almost inevitably outweigh the cost of maintaining and improving your own (as web technology gets more sophisticated, clients expect more from their websites).

I guess what's less surprising is that people are pretty wedded to their toolsets, and ready with long lists of arguments as to why theirs is better. I'm sure that when I was doing implementation, I focused some energy on "why my tools are better" (and, actually, I was right and wrong at the time - for instance, I chose Perl over PHP and PostgreSQL over MySQL in 1999). I know that's just part of the package of being an implementor. Some arguments I can certainly appreciate better than others (the Python vs. PHP ones are fun). But I'm sorry, I'm not going to be convinced that ColdFusion is a platform I should choose. I mean, it doesn't even have objects! (That's actually not the most important reason, but the fact that a web development platform that has been around for 13 years still doesn't have objects is telling.) And as I think about going back to doing implementation, platform choice is certainly something to ponder. (More on those thoughts in a forthcoming post.)



NOSI and Aspiration join forces, yay!

On 26 Sep, 2008 By mpm With 2 Comments

I've been working with NOSI (the Nonprofit Open Source Initiative) for more than 5 years. In addition, I've worked with Aspiration a lot in the past few years as well. It is a great pleasure for me to announce that NOSI is becoming a project of Aspiration, and I am re-joining the board of Aspiration. I think the two organizations together will provide a really strong focus for advocacy and resources for open source development and use in the nonprofit sector. See the press release for more detail.


SaaS vs. Open Source

On 24 Sep, 2008 By mpm With 8 Comments

I just finished writing a post for the Idealware blog about choosing SaaS vs. open source. I said in that post:

At one level, whether or not the software underneath the SaaS is open source is not relevant. You are not obtaining the software, and whether or not you can see the code, or modify it, is really not the key issue here.

And, at the level of most nonprofits choosing software, this is, in fact, correct. But the real story is much more complicated. SaaS is not, by definition, either proprietary or open source. There are a few examples of SaaS that are based on open source projects, although most SaaS is proprietary - the code is never meant to be released.

One of the most important things to understand is that SaaS is primarily built upon open source tools such as Apache and MySQL. It would not be as cost-effective (and thus, not produce as much profit) if these SaaS developers had to pay license fees for the software they use (besides the fact that these are the most stable and robust platforms to build upon). So SaaS vendors are taking good advantage of open source software, and, in many situations, not giving a whole lot back. This is not uniformly true - some SaaS vendors give back in a variety of ways: contributing code back to those projects, or having their own programs for giving back in some way or another (like Google Open Source, or Salesforce.org). But in any event, SaaS based on proprietary software violates the basic software freedoms - you can't use it freely, you can't see the code, you can't modify it, and you can't release your modifications to others. And, in some situations, the existence of SaaS can inhibit open source development in the spaces in which it is popular, especially if the SaaS is cheap or free (how many good open source webmail clients are there, for instance?)

From my perspective, the key is openness. Some SaaS, like Salesforce.com, and increasingly the nonprofit CRM SaaS vendors, are open platforms. It's all about balance, and having an active ecosystem with healthy open source options present. The more SaaS vendors can contribute to, and not detract from, that ecosystem, the better.


Welcome to the new theme!

On 22 Sep, 2008 By mpm

I changed my WordPress theme, mostly because I was getting a bit tired of the old one, and wanted something really simple. Also, it coincides with a new installation of Drupal for my consulting website, using the same basic template (called "Blueprint"). I'd been working with Drupal for NOSI (we've had the site in Drupal for a while now), but I'd never installed, configured, and messed with Drupal from scratch, so I did that. So far, it's been largely painless. I'm quite excited about really getting my hands dirty working on some plugins, or some such, and really learning the ins and outs of Drupal. It will be interesting to learn the innards of a CMS. The last time I coded in a CMS was when I was working on the (now dead) Perl CMS I wrote many moons ago. So I'm polishing off my PHP skills, and we'll see where that leads.


Tidbits

On 19 Sep, 2008 By mpm

Here are a few interesting tidbits that have come my way over the course of the last few days:

  • There is a new online fundraising platform, founded by an ex-Convio person. It's called Kimbia, and it's got some interesting features; it seems to be focused mostly around creating campaigns. The interesting thing, too, is that their model is to take a percentage of what you raise (5%). No setup fees or anything else. That's an interesting model, and, if the software is decent, it sounds like it could be a good option for some organizations. But, of course, as with anything, look (a lot) before you leap.
  • There are some fabulous sessions proposed for the 2009 NTC, running the whole gamut of nonprofit technology, from planning to Web 2.0, open source, etc. Vote for them, especially the one I'm helping out with (David Geilhufe is the spearhead) on open source CRM.
  • I'm really psyched about the Nonprofit Development Summit, happening here in my (current) fair city, Oakland (I love that I don't have to travel so much anymore). Funny thing: it's happening the same week as the Convio Summit and the Blackbaud Conference. Go figure. Anyway, if you are involved in any way in developing software for nonprofits (and, I guess, you don't code for Convio or Blackbaud - or, heck, even if you do), please make your way out to the left coast for Nov 17th.
  • Speaking of Convio, Salesforce's M. Tucker McLean weighs in on Common Ground, Convio's new (frackin' brilliant) fundraising app written on top of Salesforce. I'm still watching the fallout on this one. It's going to be interesting. Under my hat is a blog post about open source and SaaS. It might be interesting.


What are learning platforms?

On 19 Sep, 2008 By mpm

Note: This blog entry was originally posted on Idealware's new community blog. I'm honored and happy to be contributing blog posts there.

Nonprofits have become intimately familiar with Content Management Systems (CMS). Some, especially those that are very content/document heavy, have become familiar with Document Management Systems (DMS). What they might not be so familiar with are Learning Management Systems (LMS). An LMS, or learning platform, is a system that is designed to facilitate some sort of learning process over the web.

What can an organization use an LMS for? Well, that depends, of course, on the organization. Many organizations provide trainings, courses, and varied sorts of learning activities to their constituents. If your organization does this at all, and you are interested in investigating how to enhance or deliver those learning activities through the web, an LMS is for you. Learning platforms, of course, are varied. They can be very complex course management systems, designed to do things like quizzes and grading (as well as discussions, with places to store course materials). Some have fewer features, but all have some basic qualities:

  • Ability to handle multiple courses (or activities) and enroll individual students
  • Courses can be done either with specific deadlines, or at any pace
  • Course calendars
  • Messaging (between teacher and students and between students)
  • Group discussions
  • Document repositories
  • Assignments and grading
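As a sketch, the first two qualities on that list - multiple courses, each with its own enrolled students - amount to a small data model. (The names here are hypothetical, not drawn from Moodle or any actual LMS.)

```python
# Minimal, hypothetical data model for courses and enrollment.
from dataclasses import dataclass, field
from typing import List


@dataclass(frozen=True)
class Student:
    name: str


@dataclass
class Course:
    title: str
    students: List[Student] = field(default_factory=list)

    def enroll(self, student: Student) -> None:
        # Enrolling the same student twice is a no-op,
        # so the roster stays clean.
        if student not in self.students:
            self.students.append(student)


catalog = [Course("Grant Writing 101"), Course("Volunteer Management")]
catalog[0].enroll(Student("Ada"))
catalog[0].enroll(Student("Ada"))  # duplicate enrollment ignored
print(len(catalog[0].students))   # -> 1
```

A real LMS layers the rest of the list - calendars, messaging, discussions, repositories, grading - on top of exactly this kind of course/enrollment core.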

Most post-secondary institutions use an LMS to manage both regular and distance learning courses. Increasingly, a wide variety of training opportunities are being delivered via LMS. So which LMS should you look at? There are both proprietary and open source LMSs. Most proprietary LMSs are geared specifically toward the college/university or corporate training markets, and are thus quite expensive. These include Blackboard, ANGEL, and WebCT (now owned by Blackboard), among others. There are open source LMSs, some of which are quite well developed, and some also geared toward (or developed by) colleges and universities.

  • Moodle - probably the best known open source LMS. It is very easy to install, and there are an increasing number of consultants and companies offering support for Moodle.
  • LAMS - not so much an LMS as a curriculum development tool. Can be powerful if integrated with Moodle.
  • ILIAS - another open source LMS, developed by the University of Cologne.


Find me in my "office"

On 19 Sep, 2008 By mpm

NTEN has a great program, called "Office Hours", where folks can talk to people and get their burning questions about everything nonprofit technology answered. I volunteered to be the "expert" in residence for the "Program" track of Office Hours. The description: "Talk with Michelle about internal software systems - document and knowledge management, CRM, client management databases, intranets, etc." So, come join me. Fridays, 10:00am PT/1:00pm ET.



Social Network Management Systems?

On 11 Sep, 2008 By mpm With 6 Comments

I have been thinking about the software tools we call "Management Systems" - Content Management Systems, Document Management Systems, Learning Management Systems... I've also been playing a lot with an open source tool called Elgg, and have played, in the past, with Crabgrass, another open source ... "SNMS"? What do these tools allow you to do? They allow you to create stand-alone social networks. Think a white-box version of Ning, or Facebook.

Elgg, a LAMP (PHP) project, started its life as a learning platform with social network features, but has transmogrified into a social network platform with learning features. It's definitely a new project, and a very new community (with some huge warts), but it is promising for organizations that want to create private (or, perhaps, public) social networks that include groups, discussions, document sharing, bookmarking, blogs, and other things. Crabgrass is written in Ruby on Rails, and has groups, messaging, and wikis, among other features. It's a particularly interesting project, because it has a definite political purpose:

Designed for social movements working for social justice, Crabgrass will consist of tools which allow people to connect, collaborate, and organize in new ways and with increased efficiency. In particular, we aim to help groups to communicate securely, network with other groups, and make decisions in a way that is easy to understand, transparent, and directly democratic. Where traditional social networking is about connecting individuals, Crabgrass is social networking to facilitate network organizing.

In the end, I don't advocate that organizations build new public social networks in the vast majority of situations - I think they should find the people where they already are. But private social networks have their place, and can provide a compellingly interesting platform for our nonprofit-standard "members only" websites. People are getting more and more used to social networks as the vehicle for connecting to others, and this is one way to provide that in a private setting.


Google Chrome

On 04 Sep, 2008 By mpm With 2 Comments

The hiatus is over, with a short entry about Google Chrome, the new browser from Google that I learned about on the twitterverse while I was stopped at one rest stop or another. I can't test it, because my Mac that has a Windows virtual machine is packed. But I will say this: that doesn't matter. I won't be downloading it, or trying it, even when they release Mac or Linux versions.

Why so curmudgeonly, you ask? It is open source, after all. And it has some cool features. Yes, it is open source, and I applaud Google for releasing open source software. However, there was an initial brouhaha about the EULA, which at first suggested that everything you type into the browser belongs to Google (talk about All Your Base Are Belong to Us!) Yes, they changed it, but it made me realize that it is a Bad Idea to put all of my eggs in one basket. Google already knows enough about me (it reads my mail, my feeds, my search history, and a few shared documents, to boot); I'm certainly not going to add virtually everything else I do (the percentage of things I do using a protocol other than http(s) is dwindling by the second). If someone releases a "Chrome minus Google" - that is, a version of Chrome with all of the "phone home" code completely eliminated - then I'll think about using that version, just to see what it's like. Otherwise, fuggetaboutit.


Hiatus

On 26 Aug, 2008 By mpm

As many of you know, I'm about to make a big move: to the left coast, to the big city of Oakland (or thereabouts). This is a good move for me, in many ways. In one particular way, I'm psyched to get to be in one of the hot spots of my field, and actually have a casual beverage with lots of colleagues I've gotten to know over the years, but have only seen once a year (or less often, even.) I'm driving cross country, leaving Friday, and arriving sometime during the weekend of the 6th and 7th of September. Taking a slow, leisurely drive. I won't be blogging here, and I know that my work will be cut out for me in terms of catching up when I land. I'll be blogging on my personal blog as I drive (well, not while I'm driving) and tweeting as well.


Speaking of open social networks ...

On 14 Aug, 2008 By mpm With 2 Comments

I just joined identi.ca, a microblogging service based on an open source project, Laconica; all of the updates are licensed under a Creative Commons Attribution license. You can log in using OpenID. All really great stuff. I imagine, too, that because it's based on an open source platform, developers will begin to code in data portability (or have they already?) The documentation is a bit lacking, and it's clear that it's a very new project. There are an increasing number of third-party apps that can use it (it supports the Twitter API). So I'm on identi.ca now (follow pearlbear). Like all social networks, it's only as usable as the number of people in your social graph who use it, and it's pretty sparse for me right now. But hopefully that will change.


WeAreMedia Project

On 13 Aug, 2008 By mpm With 5 Comments

Yet another great NTEN project is happening, spearheaded by Beth Kanter, to develop a social media toolkit for nonprofits. It's called "WeAreMedia." I have been really slow on the uptake with this project (it started while I was on vacation, and I never caught up after I got back), but I hope that I will be able to keep up, and participate more fully in it as it evolves and grows. The first set of modules, on the "Strategic Track," are already done. The next set of modules, which are more tool-based (with case studies), are next to be developed. The project took a short break to catch its collective breath, and Beth has some thoughts and ideas that came out of that conference call. I'm excited about this project - it's gathering the knowledge and expertise of a great group of folks, and it will provide a free source of information and case studies so that nonprofits can best figure out how to step into the frothy waters of social networks.

A note: Most people will notice that I have pretty much lost my curmudgeonly approach to social media. A year ago, I was bearish on Facebook, and said I wouldn't Twitter. Now, I tell my clients that they really have to think about a Facebook strategy, and that nonprofit staff can gain a lot from networks like Twitter. I'm up to 1,700+ updates on Twitter, and keeping up with my lifestream on FriendFeed is about as difficult as knowing where I'm moving to next. No, I didn't go soft in the head (well, some people might think so). What has happened is basically a sea change in the landscape. Not only are Gen Y and Millennials engaged in these social networks, but a wider and broader range of people are. It's fairly clear to me that going forward, increasingly, social networks are a major way people are interacting on the web - and nonprofits need to understand how to engage their constituents given those changing realities.

Of course, I've been a wiki fan since the very beginning, and I haven't lost the desire for true data portability, and for open source alternatives to the current social networks. However, as you all know so well, I'm a realist.


Tidbits

On 13 Aug, 2008 By mpm With 1 Comment

I love these tidbits blog posts. I come across a lot in my varied net wanderings, and people send me stuff, so it's a nice way to talk about some of it, without having to go into too much detail.

  • I was profiled on Linux.com. That's kinda fun! I "live the open source lifestyle." That must have been before the iPhone.
  • This blog is rated 8.2 ("very good," they say) on Blogged.com. I'm not sure exactly what that means, but I guess it's a good thing. And why is this in "social issues" and not "technology"? But, anyway, I'm not putting the silly badge in the sidebar, sorry.
  • MPower has started to generate some community-driven development. These look like pretty interesting and useful applications. But I still haven't seen much activity on the SourceForge page, nor does it seem that the code for these new projects is available. So the reality is still pretty far behind the hype. I sure hope the reality catches up.
  • This news is so old it's embarrassing - NPR opened up their content API. Way to go, NPR!
  • Some really interesting things are brewing with CiviCRM. First, they are putting the finishing touches on a new version, and there are some interesting projects happening, like integrating voter files, phone banking, and, my favorite, case management. I'm excited to see what community-driven development can do!
  • A new site was launched called "Green Nonprofits." It looks interesting - a joint venture by a group of nonprofit-focused for-profits, none of which I've heard of (which doesn't mean a thing, really). Something to watch.


My iPhone 3G

On 12 Aug, 2008 By mpm

I once joined the "iPhone non-buyers" group on Facebook. I swore up and down I wouldn't get one. I said:

I will not be buying an iPhone until they sell an unlocked version that doesn't need to be hacked to use third party applications.

Well, the iPhone 3G changed the equation, some. The iPhone 3G does have 3rd-party apps, but they have to be vetted, etc., by Apple, which is a mixed bag. The iPhone 3G had some compelling features and 3rd-party apps (like Pandora, which, I have to admit, is about half the reason I got the phone). So, I got one, about two weeks ago. And it's been a combination of sheer joy and sheer frustration.

In general, the phone actually works quite well. The phone interface is great, visual voicemail rocks, reading email is really good, and surfing the web is decent - way better than on a Blackberry. But the apps, and the App Store (where you get apps), have been the cause of crashes and iPhone lockups requiring multiple restorations (which, luckily, are relatively painless). There are some great apps and games, some really good ones for really cheap (or free). But getting them onto the iPhone, or getting them to work, is sometimes a pain. Syncing is way slower than on a regular iPod, and backups can take half an hour, which is absurd. There are multiple threads on multiple sites about problems with the iPhone 3G - clearly, some software fixes need to happen, and the faster they happen, the happier I'll be. But driving down the highway, listening to my favorite Pandora stations, is the best thing since sliced bread.

My suggestion: if you like gadgets, are addicted to being connected at all times, and feel like you need reading glasses to read email or the web on a Blackberry screen, the iPhone might be a good bet - but I'd wait until software version 2.1 comes out. And if you are eligible for an upgrade, it's cheap (relatively speaking, for smartphones). I know some folks are holding out for an Android phone, which, from what I can tell, is going to be pretty similar.


What is cloud computing?

On 11 Aug, 2008 By mpm With 5 Comments

You've likely heard a lot about "cloud computing". And what's true is that the sales talk about computing in the cloud certainly makes the conceptual issues behind it, honestly, well, cloudy. So I'm going to try to lay out the details of what cloud computing is, and how it's useful for nonprofit organizations.

Quick definition: Cloud computing is basically running applications on the web via "Software as a Service" (SaaS). That includes applications from Google Documents, to Salesforce.com, to Gliffy.com (the service I used to create that graphic). It also includes applications that you might develop (or have developed) that are hosted outside your network. That's really all it is - there isn't anything fancy about it. It still requires the hardware, operating systems, and databases that more traditional applications inside your network require, but, generally, you hand off that responsibility to the folks who host your application, and access the application through the internet.

Advantages to cloud computing: The basic advantages are that you don't have to maintain infrastructure for applications, saving you labor costs as well as electricity costs. Also, you can access the applications anywhere you go. Disadvantages to cloud computing: Depending on the vendor and the application, you are dependent on them to keep the application up and your data intact. Changes in the application happen without your knowledge or consent. Your data is not directly in your hands, but in the hands of a third party. And you are dependent on your internet connection - which could be a problem for mission-critical applications.

What makes it possible: Cloud computing is made possible by two trends that have happened closely in parallel: high bandwidth to the curb, and massive data centers. High bandwidth to your home or office is a necessary requirement for cloud computing. Cloud computing just doesn't make any sense, or work in any reasonable way, without it (have you ever tried to use Gmail on dial-up?) As the available bandwidth increases (via FiOS and other methods), cloud computing will get even more attractive to organizations and people. Huge data centers are being thrown up everywhere, and more and more companies are getting into the business of providing hosting for SaaS developers. Companies such as Amazon are creating massive grid storage and computing services for applications in the cloud.

What makes it usable: Newer applications use AJAX and Flash to provide the kinds of functionality we've come to expect from desktop applications - so it's just like having a desktop application with our data, except it's "in the cloud," not on our desk. As the limitations of both AJAX and Flash are overcome (and as both develop further), expect even more usability for online applications. Further, efforts like Adobe AIR and Microsoft Silverlight are bringing full-fledged desktop application functionality to applications in the cloud. What you should do:

  • Make an assessment - will using this online tool really save money or time, or facilitate collaboration in ways that are not possible with local apps?
  • Always read the privacy policy - if you have sensitive data, this might be a deal-breaker
  • Always maintain your own backups. If the provider goes belly up with your data, you're toast.
  • Make sure access is secure. Read up on the security of the application
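That backup point deserves emphasis. As a minimal sketch (the export URL, API token, and JSON export mechanism here are hypothetical - substitute whatever export facility your provider actually offers), a scheduled script that regularly pulls your data out of a hosted application might look like this:

```python
import json
import urllib.request
from datetime import date

def backup_filename(app_name, day=None):
    """Build a dated filename so old backups never overwrite each other."""
    day = day or date.today()
    return "%s-backup-%s.json" % (app_name, day.isoformat())

def export_data(export_url, api_token):
    """Fetch a full data export from the (hypothetical) provider API."""
    request = urllib.request.Request(
        export_url,
        headers={"Authorization": "Bearer %s" % api_token},
    )
    with urllib.request.urlopen(request) as response:
        return json.load(response)

def run_backup(export_url, api_token, app_name="donor-db"):
    """Pull the export and write it to a local, dated JSON file."""
    data = export_data(export_url, api_token)
    path = backup_filename(app_name)
    with open(path, "w") as f:
        json.dump(data, f, indent=2)
    return path

# Schedule from cron, e.g. nightly:
# run_backup("https://example-saas.com/api/export", "SECRET-TOKEN")
```

The point isn't the specific code - it's that a copy of your data should always exist somewhere the vendor can't take it with them.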


How do you keep up?

On 05 Aug, 2008 By mpm With 4 Comments

I have been thinking for the last few days about what it means to "keep up" with the technology field, particularly 'net technology. I've been helping a client hire a temporary project manager, and so in the interview process one of the questions I ask is how people keep up with change in the field. In some ways, I have been blessed with the gift of osmosis. I'm a fantastic book learner, which means I'm a great blog/twitter learner, too. Also, one of the things I do is blog - so I regularly have to process and digest information to write decent blog posts. One of the prompts for this post was also that I'm preparing a post on "cloud computing", just reviewing what it means, and what it is, and why it's important - and that will undoubtedly help me to keep up with that whole set of things. How do you keep up? And, in a bigger picture way, how important do you think it is for people who work with nonprofit organizations on technology issues to keep up? How bleeding edge do we need to be?


On 30 Jul, 2008 By mpm With 2 Comments

Here are posts from a small sampling of bloggers I regularly read:

  • Holly Ross shows her geek cred in a great post about DNS. I love Holly's approach to technology.
  • Amy Sample Ward lets us know how to give green. Amy is a great source for nptech tips of all types.
  • Beth Kanter asks whether or not we can add some more steps to Chris Brogan's fabulous 50 steps to create a social media presence. Beth is, of course, THE web 2.0 and social network guru for the nptech community.
  • Allan Benamer talks about Convio's new big client, Susan G. Komen for the Cure, for their Salesforce-based application formerly known as Aikido, presently known as Common Ground. I trust Allan's gut instincts about CRM technology.
  • David Geilhufe ponders how technology solutions split communities, and issues of a lack of strategic vision. It's a great, thoughtful post. David is always a great source for thoughtful insights about open source.
  • Michele Martin talks about developing a personal learning plan. I read her blog for great tips on professional development and technology.



Carnival of Nonprofit Consultants

On 27 Jul, 2008 By mpm With 6 Comments

This week, it is my pleasure to host the Carnival of Nonprofit Consultants. I asked the question: What is the biggest mistake a nonprofit can make with its website? I got some interesting answers:

  • Ken, at the Nonprofit Consulting Blog, talks about transparency, and how it's a big mistake not to be transparent. He has some good ideas and suggestions about how to be transparent as an organization through the website.
  • Kivi, in her Nonprofit Communications blog, suggests that a website needs to be about the visitor and not about the organization: "The biggest mistake that a nonprofit can make with its website is to use it as an old-fashioned brochure, where you immediately hit the visitor with your long, jargon-filled mission statement, right at the top or smack in the middle of the home page, followed by bulleted lists of 'projects' or 'services.'" She gives some great suggestions and examples.
  • Joanne Fritz talks about three big mistakes - outdated information, insufficient contact information, and outdated design. She makes some great points, and gives good tips to make changes.
  • The Hack Artist suggests that it\'s important to marry direct mail efforts and a web presence.
  • James Young, on the Connection Cafe, suggests that we think about constituent empowerment when we think about organizational websites.
  • And, since I'm the host, I get to add a couple of bonus mistakes. I think one of the biggest mistakes that an organization can make with its website is to promise more than it can deliver - make sure that the resources to create that blog, or podcast, or photo gallery, or whatever bells and whistles you promise on your website, are there when the website goes live.
  • I do think the biggest mistake an organization can make in the re/creation of its website is to go with the vendor with the lowest bid. It's about a lot more than price - it's quality of work, whether you like their previous work, their overall reputation, as well as their fit with you as an organization.


Tidbits

On 25 Jul, 2008 By mpm

  • Between the Connection Cafe, and the new name for their fundraising database, Common Ground (formerly known as Aikido), I have to admit that I'm beginning to think of coffee when I think of Convio. Is this a bad thing?
  • Myspace is going with OpenID! That's a great step. There are some other interesting moves outlined in that great post by Marshall Kirkpatrick, my current favorite ReadWriteWeb blogger.
  • Android for the masses, iPhone for the rich? Read an interview with an Android developer. It's an interesting question, what Android might (or might not) turn out to be. So far, it's simply vaporware.
  • Not waiting for Android vaporware (my research suggested it was probably a long time coming, and would not be on my carrier, AT&T), I decided to succumb and buy an iPhone 3G. More on that in a later post.



The "Open Source Software is Free" myth

On 14 Jul, 2008 By mpm With 9 Comments

I had a startling realization a few days ago. I seem to spend inordinate amounts of time responding to people (proprietary software vendors, to be specific) harping on the idea that "open source software is free" is a myth, and blathering on about how it's not really free, because you have to hire a geek to install it and maintain it, and blah blah blah. No credible nonprofit technology open source advocate has ever suggested that open source software was free to implement. In fact, we all go out of our way (like in the open source primer) to talk about total cost of ownership, and how, cost-wise, implementation of open source software is sometimes a wash with proprietary software. I've been caught using the "free as in kittens" metaphor more than once. We talk much more about the value and values that free and open source software bring to the table. My realization was this: the myth is entirely of the making of the proprietary vendors who claim it is a myth. There would be no myth if it were not for them. No one would think that anyone thought that implementing open source software was without cost. And from now on, instead of writing some long-winded response, I'm just going to put in a link to this post.


My Theory of Practice

On 10 Jul, 2008 By mpm With 2 Comments

I finally had a reason to begin to more completely articulate my theory of practice. My theory of practice is different from my consulting philosophy. They certainly are consistent with each other, but they are distinct. A theory of practice, in my mind, outlines the methods and ideals behind how I get work done with clients. This theory includes the following elements that I think are key to my work:

  • Listening. Listening, both at the beginning and consistently throughout an engagement, to clients' goals, ideals, "points of pain", and points of confusion.
  • Educating. One of the most important roles I play is educating clients about the technology that they will be engaging with, based upon what I've heard while I've listened. This is also an ongoing process.
  • Intermediation. The role I play most often currently is providing a clear and understandable avenue between the client and a technology vendor (such as a web or database development shop). The client is quite knowledgeable about their organization, mission, and goals for a project, but often not knowledgeable about technology. The vendor is expert at what they do, but cannot always provide a channel of communication that the client can really work with. I provide that clear channel, so both sides benefit.
  • Learning. Those first three elements make up the communication arm of my practice. The other arm is learning. I can't do what I do without being a technology expert. And I can't stay a technology expert without continually learning. Reading, research, collaborating with others, getting my hands dirty with servers and code, playing with new applications and new APIs - all of those things keep my technology expertise fresh.

More specifically, these are the methods I use to help clients make their way through the entire process of a technology project:

  • Qualitative and Quantitative (where appropriate) assessment of requirements and needs, including surveys and interviews with internal (and/or external) stakeholders
  • Research - both standard internet research as well as outreach and interviews with relevant people
  • Writing - writing requirements, RFPs, documentation
  • Project management - keeping a project on track
  • Evaluation - evaluating projects as they are happening, and when they are done.


MPower Open keeps moving forward

On 06 Jul, 2008 By mpm With 5 Comments

This is old news, but I've been busy. What's kind of funny is that I was quoted in the press release, which was out 3 weeks ago or so, but it's taken me this long to blog it. Bad blogger! (I think my clients thank me for being a bad blogger.) So what's the news? MPower Open is now on SourceForge, and they released their product under the GPL v3. These are good steps forward. This is what I said (in what I think is my first quote for a press release):

"By adopting a well-regarded license, joining the SourceForge platform, and launching its community, MPower is making great strides in creating an open source community around its application," said Michelle Murrain, Principal, MetaCentric Technology Advising, and Coordinator, Nonprofit Open Source Initiative (NOSI). "I look forward to the growth of this community, and the ongoing development of the MPower solution as an open source alternative CRM for nonprofit organizations."

So my hope is that they really begin to use the platform. So far, there is basically no activity in the forums and mailing list. It's going to take some real elbow grease - reaching out to people who might begin to form the kernel of a development community - to get that going. "If we build it, they will come" only works in the movies.



What software freedom means to me

On 25 Jun, 2008 By mpm With 1 Comment

I got some interesting comments on the last post about Linux desktops. I realize that I haven't talked about this in a while, and I'm not sure I've actually ever articulated this completely on this blog. So here goes. I got involved in Linux a long time ago. I was a professor at the time, and a nonprofit organization wanted to get on the web and give some of their staff email; at the time, colleges and universities were the only organizations that had easy access to the internet, and virtual hosting companies cost a fortune, way beyond what a nonprofit could afford. The date was sometime in 1995. We set up a little box in the corner of my office, and loaded several piles of floppies containing the Slackware distribution onto it. After a few hours (as opposed to the few minutes it would take now), we configured that server to hold a website and serve email. The old site is still up on the Wayback Machine. I co-administered that box for a few years. Eventually, they got a T1 and moved the server in-house. I left academia to do that sort of thing with nonprofits full time. In fact, that experience, and the work I did around it with that organization, was the first step into this whole nonprofit technology field. What I learned about Linux back then was that it was a way (along with the help of a college) for a nonprofit organization to get on the web easily and relatively inexpensively. It leveled the playing field, so that an organization without many resources could do what, at that time, required a lot of resources. In many ways, for me, the most important aspect of free and open source software is that it does just that - it levels the playing field, so that people and organizations with few resources can have access to quality tools to do what they need and want to do in this software-driven world. I've learned a lot about FOSS since then, of course, and the other aspects of FOSS have also come to be very important to me.
I do agree, fundamentally, with the four freedoms laid out by the Free Software Foundation:

  • The freedom to run the program, for any purpose (freedom 0).
  • The freedom to study how the program works, and adapt it to your needs (freedom 1). Access to the source code is a precondition for this.
  • The freedom to redistribute copies so you can help your neighbor (freedom 2).
  • The freedom to improve the program, and release your improvements to the public, so that the whole community benefits (freedom 3). Access to the source code is a precondition for this.

And ultimately, in Michelle's perfect world, all software, all content, all hardware, etc., would be free (libre). But we don't live in Michelle's perfect world; we live in this world. This broken, very imperfect, very problematic, and quite capitalist world. And in that world, I am a realist. I am ecstatic, and do many happy dances a day, that there are people who write and support free software. I think of myself as one of them (besides working with NOSI, I've been involved in several projects over the years in varied capacities.) The number of situations where one can still argue on a functional and cost level that proprietary software is a better bet gets smaller and smaller. I could easily argue that for the overwhelming majority of places you need an operating system, the free and open source alternatives are better (if you count the BSD core of Mac OS. If you don't, it's still the majority.) There are innumerable really great free and open source desktop applications that can run on any OS, and there are more every day. And, surprisingly to me, I'm quite happy that lots of big corporations are now really getting into free and open source software support. I think, ultimately, it's when big corporations want to ditch Windows on the desktop that the biggest strides will be made in Linux desktop usability and support. That's a tide that will really lift all the boats. Bottom line for me: free and open source software is about leveling the playing field; free access, community benefit, and community control allow this. That's why I got into it in the beginning, and that's why I'm sticking around, and doing what I can, even though I'll be using my Mac (with, of course, a lot of free software applications on top.)


Linux desktops?

On 24 Jun, 2008 By mpm With 12 Comments

I'm doing a webinar on Linux desktops next month, and it's making me think a lot about my own experience using Linux on the desktop, and where I think things are going. If you've read this blog for a while, you've heard my various sagas around using Linux on the desktop. I migrated to making it my primary desktop about a year ago. I have had varied problems, from issues of software integration, to video problems, to wireless issues ... The list is getting very long. And, guess what? I'm giving up. At some point, when I've saved up enough pennies, I'm going to buy a Mac laptop again. I've basically switched to using my Mac mini for just about everything except the bit of systems administration and coding I do, because it's just so much easier to set up things on Linux for that type of work. What happened was that I felt I was wasting too much time on things that should be easy. It should be easy to plug in a new monitor. It should be easy to get wireless. It should be easy to add a new printer. It should be easy to play a DVD. These problems are largely not the fault of Linux itself. On the Windows side, hardware manufacturers make proprietary drivers for Windows, and very few make drivers for Linux, or open source their drivers so that Linux developers can use them. On the Mac side, Apple controls the hardware, so there never is a problem with it (and, of course, there are plenty of peripherals that don't work with Macs.) And then there are the proprietary codecs and DRM, which are all tied to an OS. So what does this mean for Linux on the desktop? This experience has made me think a lot about where organizations should think about using Linux, and where they should steer clear. My theory is that where Linux is going to really work is in dedicated devices built from the ground up to run Linux, and used for relatively limited purposes. The Eee PC is a great example, as are cell phones, media players, etc. This is where Linux will shine.
And, of course, there are some other situations where Linux also shines: kiosks, internet cafes, computer labs, and email/web workstations. There isn't a good reason, at this point, *not* to use Linux there. I think it can also work for folks who use laptops as their primary machines and don't do anything except email and web. And, of course, always, for developers. I probably will always have a Linux desktop around, even if it ends up being a virtual machine, for the varied (small, at this point) development projects I have going. It's just dead easy to use Linux for development - easier than using the Mac, even though it has BSD as its basis. Where is Linux not a good idea? For most creatives and knowledge workers who are not developers. There are just too many things we need - too many cool new peripherals, too many kinds of data and media we want to manipulate, etc. And we don't (at least I don't) want to spend too much time getting all of that to work. And there is still a lot of newer software and services that aren't coming out in Linux versions (for instance, there does not exist a decent, usable twitter client for Linux - gwitter is not very usable at all, and the others just don't work.) And, of course, designers need software that just doesn't have high-enough-quality open source alternatives yet. I think that, for a while at least, Linux won't be a good desktop option for this broad group of people. Which is unfortunate, and I hope it changes. Linux has made huge usability strides in the past few years, as has open source software in general, so I think the future is still bright.


NPTECH Punk

On 19 Jun, 2008 By mpm With 2 Comments

Beth, of course, suggested this, and I'm jumping on her bandwagon. I realized, in being introduced to Edupunk, that I have been doing it for, oh, almost 20 years now. In 1989, I joined the faculty of Hampshire College (and stayed for 10 years). Hampshire's motto is "Non Satis Scire" - to know is not enough. From their website:

Some of the features that distinguish Hampshire from more traditional liberal arts colleges include student-designed academic concentrations; an active, collaborative, inquiry-based pedagogy; an interdisciplinary curriculum; and a narrative evaluation system.

Sounds a lot like Edupunk, doesn't it? But in the nonprofit realm, my perspective on helping nonprofit organizations with technology issues has a lot to do with client empowerment, learning based on what's needed at the moment, and active collaboration. I got a chance to test this out in a more orchestrated way (as opposed to the usual consultant/client interactions) when I facilitated/taught an OpenOffice.org "untraining" earlier this month at Google HQ in NYC (some more details are on the Google Blog.) I learned a lot. The unconference/camp model of learning about technology issues is really great, but falls a little short when dealing with a specific tool and an audience that is mostly unfamiliar with it. So the model that I am coming up with is a combination of that model and what I would call an "inquiry-based" model - helping people, in a more structured way, come up with specific questions and problems before the event, and then using the event to collaboratively answer those questions and solve those problems. The questions and problems are generated exactly from the needs of the participants - what do they need to do? Anyway, I do hope at some point to have a chance to do this kind of thing again. And I think it would be great to have an nptechpunk mini movement!


Frackin' Brilliant

On 17 Jun, 2008 By mpm With 4 Comments

That's what I said to Tompkins Spann, of Convio, when he told me last week (after I duly signed the requisite NDA) that Convio was going into the donor database business by building an app on top of Salesforce.com. Actually, I didn't use the Battlestar Galactica expletive, I used the one you're more familiar with. :-) Convio is launching the new application, built on top of the Force.com platform. It is named, as of now, Aikido. It has the kind of functionality you'd expect from a donor database. It seems, in a business sense, to be a brilliant move: leverage the power of the open platform of Salesforce.com, do the work that nonprofits (and consultants) have been having to do to bang Salesforce.com into shape as a donor database, and make it reasonably priced. They are starting with a "charter" program - a few nonprofit organizations, to iron out all of the varied issues, both technical and logistical, I imagine. This ups the ante major league for Blackbaud, for sure, as well as for other long-time desktop donor database providers. It may, depending on pricing, even give the open source CiviCRM a bit of competition. And it means an interesting dance for those in the nonprofit Salesforce community. Of course, the proof will be in the pudding - we'll find out over time how this app works, and whether organizations like it. But the whole CRM field just got more interesting.


Tidbits

On 11 Jun, 2008 By mpm

  • The Tides Foundation is accepting applications for the 2008 Antonio Pizzigati prize, a $10,000 annual award for outstanding contributions to software in the public interest. The competition, judged by a panel of national leaders in public interest computing, is now entering its third year. The application deadline for this year's prize: September 1, 2008. Last year's winner was Barry Warsaw, lead developer of the awesome Mailman mailing list manager.
  • Ruby Sinreich is a new phenomenon! You go girl!
  • NTEN keeps surprising me with cool stuff for nonprofit tech peeps. And I'm on the board! Check out the book club, reading Media Rules!: Mastering Today's Technology to Connect with and Keep Your Audience, by Brian Reich and Dan Solomon.
  • Check out the new images and videos of Android, the open source phone. And the video with the kids is very cute. Wait for this, or buy an iPhone? I might wait, depending on carriers...


What is private? What is public?

On 10 Jun, 2008 By mpm With 2 Comments

Today, someone on the Progressive Exchange list asked about a tool called Rapleaf. A story about Rapleaf in ClickZ (a newsletter for online marketers) says this:

Rapleaf allows you to quickly and inexpensively find out the social networking footprint of those you're marketing to. Just send the company your e-mail list and tell it what social networking sites those on your list are using, their demographics, the numbers of friends they have, how many widgets they're using, even their interests. Rapleaf digs into the usual social networking sites (Facebook, MySpace, etc.), as well as newsgroups, commerce sites (like Amazon), review sites, forums, and news groups, and even searches the general Web to find out where your people are and what they're doing online.

An interesting conversation ensued on the list - with some arguing that this was a problematic thing. I actually thought this could be quite useful for organizations figuring out how to allocate scarce resources in the Web 2.0 space. But that's not the point of this post. I realized that one of the most important things we can do is educate the organizations we work with (as well as individuals) about privacy issues and data. When is data public? When is it private? How do we know? How can we ensure privacy? It is important to understand that Rapleaf is just gathering public information on people, based on their email addresses. It is an inevitable result of our desire for social networks, as well as our desire for information to be portable (as in RSS feeds.) What's important is that we understand what is actually public and what isn't, and how to keep what we want to be private, really private.
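To make the "based on their email addresses" point concrete, here is a small sketch of how public services tie data to an email. Gravatar, for instance, identifies you by an MD5 hash of your trimmed, lowercased address, so anyone who has your email can look up your public avatar (the address below is, of course, made up):

```python
import hashlib

def gravatar_url(email):
    """Gravatar's public lookup key: MD5 of the trimmed, lowercased address."""
    normalized = email.strip().lower()
    digest = hashlib.md5(normalized.encode("utf-8")).hexdigest()
    return "https://www.gravatar.com/avatar/" + digest

# Any variant of the same address resolves to the same public URL -
# which is exactly the kind of linkage services like Rapleaf exploit.
print(gravatar_url("Jane.Doe@example.org "))
print(gravatar_url("jane.doe@example.org"))
```

No login, no permission - just arithmetic on an address you hand out freely. That's worth keeping in mind when deciding what to attach to which email account.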


How's yer CMS?

On 29 May, 2008 By mpm With 1 Comment

NTEN just released their CMS satisfaction survey. There is some great food for thought, although in some ways, the results aren't so surprising. Most people (67%) want an easy-to-use interface. Most people (57%) also want ongoing support. The single most used CMS was Drupal, at 15%, followed by Plone and Joomla (approximately 8% each). WordPress (which was not originally in the survey; they parsed this data out from the "other" category) was at 2%. Further, looking more at the "other" category (which made up 29% of the CMSs reported), there were quite a number of other FOSS CMSs, including ImpressCMS, Zope, Movable Type, and Typo3. All of the free and open source options did quite well in terms of Quality, Usability, and Value. The other questions, which are really more geared toward particular vendors, are for FOSS CMSs not really applicable to the CMS itself, but to the consultants or vendors who implement it - which is bound to be variable. Only two proprietary CMS systems, Antharia's and Ektron's, had scores as good as the open source CMSs. The CMS options from the big three (er, now the big two) didn't score as well. Anyway, the survey report and survey data are worth looking at if you are shopping for a CMS.


Blackbaud buys Kintera

On 29 May, 2008 By mpm With 2 Comments

In retrospect, this probably was inevitable. And I'm sure that Kintera's very low stock price certainly made a buyout by Blackbaud easier. Today, Blackbaud announced the acquisition of Kintera. So, there is now one less nonprofit CRM vendor to choose from, and Blackbaud keeps getting bigger. But will it get better as a result? Hard to know. In Kintera, Blackbaud certainly got its hands on a platform with pretty good open APIs (Allan Benamer argues they are better than Convio's.) Will they continue in that direction? Blackbaud's other recent acquisition, eTapestry, did open up their APIs recently, although those leave much to be desired. It will be interesting to watch what happens with Kintera, and especially what happens with their APIs. And how will Convio react? And, of course, there is still Salesforce.com, as well as the developing realm of free and open source options, like CiviCRM. It's going to get interesting, for sure. But, in all honesty, if I were part of the teams behind any of those options, there would be no shaking of boots because of this merger. Update: Allan Benamer has some more information that's worth reading.



Google Health launches ... and it's not HIPAA compliant

On 20 May, 2008 By mpm With 1 Comment

Yesterday, the big news was that Google Health launched. Google says:

"Google Health aims to solve an urgent need that dovetails with our overall mission of organizing patient information and making it accessible and useful. Through our health offering, our users will be empowered to collect, store, and manage their own medical records online."

Sounds pretty interesting, but hold on a second. Before you sign up, read the privacy policy carefully. And note: this application is not HIPAA compliant. Here's why: they do have a point - since they don't provide health services, they don't need to comply with HIPAA. The language (especially in this table) seems to suggest that the privacy they are providing is better than HIPAA. I'm not so sure, and, in the end, it comes down to "trust us". I'm just not sure how far I should trust Google with my health care data - it's enough of a leap to trust them with my email.

Continue Reading

Tidbits

On 19 May, 2008 By mpm With 1 Comments

There are some really interesting tidbits of stuff out there. Here are a few:

  • The blog at the Nonprofit Times called "Don't Tell the Donor" has a very interesting entry on the flurry of benchmarking studies that came out recently. It's titled "Benchmarking With a Warped Stick," and it takes aim at Convio's recent benchmarking study. To their credit, Convio asks this question on their blog: "Should organizations like Convio, and Giving USA continue to offer these sorts of insights to the nonprofit community, or is this simply self-serving marketing fluff?" What do you think? Here's what I think: Convio should partner with neutral players to underwrite benchmarking studies that get data from a much wider sample of organizations than just their clients. Everyone (including them) gets better data.
  • Speaking of metrics, Drew Bernard has an awesome post about how to use web analytics based on the functions of specific pages or sites. Way to go!
  • Allan Benamer points out that Kintera is about to be delisted by NASDAQ. Blackbaud hasn't been doing so well in the stock price department, either. Commenters on Allan's post wonder about the fate of Convio's IPO. It makes me think a lot about this whole space, and wonder whether the fast-moving train of Salesforce, and the slower-moving trains of the open source alternatives, are beginning to bear down on the old guard, and how the old guard will respond (or not, which would spell doom).
  • Speaking of responses from the old guard, eTapestry, which was bought by Blackbaud last year, is opening up its API this week. Allan, in his inimitable way, points out how bad the API is. And yeah, after reading the docs, I agree: it's bad. So is it good news when companies open APIs that don't make them easy to really use? Are these attempts at "OpenAPIWashing" (my new term for companies that might spend more $ promoting their APIs than actually developing them), or are they just steps along the way to really good, solid, usable APIs? Time will tell.

Continue Reading

It's been a while...

On 12 May, 2008 By mpm

As you've probably noticed, I haven't been blogging a whole lot lately. I've been pretty busy with a variety of projects. I'll be on hiatus for about another week, and will have some long-awaited new FOSS tools, as well as other posts that have been brewing for a while.

Continue Reading

Rate your CMS!

On 23 Apr, 2008 By mpm

NTEN is doing some great work getting information about the use of different kinds of tools in the sector, how people are using them, and how they like them. They are doing a CMS satisfaction survey, and the more info they get, the better. So go rate your CMS!

Continue Reading

Free and Open Source Tool #16: CiviCRM

On 21 Apr, 2008 By mpm

In honor of the webinar happening in a couple of weeks, I figured I'd talk a bit about CiviCRM. CiviCRM is a nonprofit-focused open source tool, centered around membership, fundraising, events and such. It was one of the first (of a now growing number) of nonprofit-focused open source tools, and it originally came out of the idea of moving eBase (the CRM based on FileMaker Pro) to the web. CiviCRM has four basic components: CiviContribute, CiviMail, CiviMember, and CiviEvent, which together allow you to track contacts, donations, and members, send out email blasts, handle event registration, etc. There is even a new case management feature in 2.0, which can be useful for organizations that need that functionality. CiviCRM is a great CRM for small-to-medium-sized organizations that need CRM functionality. To ensure email deliverability on blasts, you'll need to have it hosted somewhere that actively deals with whitelists and the like; otherwise, if you don't need that functionality, it's easy to host on a generic hosting account. It can integrate with either Drupal or Joomla, and there is a new stand-alone version as well. The Drupal integration is better, but a lot of active development is going into improving the Joomla integration. CiviCRM ranked #1 in satisfaction in the recent NTEN CRM satisfaction survey.
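
CiviCRM also exposes its data programmatically. As a purely illustrative sketch (the endpoint path and parameter names below are assumptions for illustration; check the REST documentation for your CiviCRM version), a contact query might be assembled like this:

```python
from urllib.parse import urlencode

def civicrm_rest_url(base_url, entity, action, api_key, site_key, **params):
    """Build a CiviCRM-style REST query URL.

    The endpoint path and key parameter names here are assumptions;
    consult the REST docs for the CiviCRM version you run.
    """
    query = {
        "entity": entity,
        "action": action,
        "api_key": api_key,   # per-user key (assumed parameter name)
        "key": site_key,      # site key (assumed parameter name)
        "json": 1,
    }
    query.update(params)
    # Sort for a stable, predictable query string.
    return f"{base_url}/extern/rest.php?{urlencode(sorted(query.items()))}"

url = civicrm_rest_url("https://example.org/civicrm", "Contact", "get",
                       "USER_KEY", "SITE_KEY", contact_type="Individual")
```

The point is less the exact URL shape than the pattern: entity, action, and credentials go in as parameters, and JSON comes back, which is what makes integrating CiviCRM with a website or other systems straightforward.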

Continue Reading

Tidbits

On 17 Apr, 2008 By mpm With 1 Comments

I guess because I'm a blogger, I get these interesting tidbits in my mailbox. I don't always have a lot of time to investigate them, or to figure out whether they are useful, but I do like to not completely ignore the ones that look interesting. I do hope folks will comment if they know something about these, or have an opinion.

  • ReviewBasics is a collaborative editing and reviewing tool. I've perused the demos, and it does look like it would be useful for reviewing things like specs, wireframes, etc.
  • Zoomgrants looks like it is trying to be some sort of one-stop shop for grantors and grantees. As a funder, you can put up RFPs (at a cost), and people can apply directly online. It looks interesting, and potentially useful. I wonder how many foundations will go for it, though. I started a little Twitter conversation about it, and people seemed intrigued.
  • BlueAvocado is a new online blog/magazine/website specifically for folks who work in and with nonprofits. It looks very promising. I'm looking forward to reading it.
  • Convio is getting into some interesting territory these days. They've been doing some nonprofit research, and have released two studies: one on the "wired wealthy," and another that is a "nonprofit benchmark index" study, basically providing some benchmarks for organizations to measure themselves against (traffic, email newsletter click-through rates, etc.). It's actually a pretty interesting resource, and worth a read. Of course, it's only a study of Convio's clients, but it's interesting nonetheless.

Continue Reading

Free and open source tool #15: MPower Open CRM

On 14 Apr, 2008 By mpm With 3 Comments

I am so far behind, it's not funny. I've got to catch up. My goal is to catch up by the end of this month, so that I'll still be on track to make it to 100 free and open source tools by December. This post gives me the chance to finally write the very belated post on MPower I've been meaning to write since I got back from NTC.

MPower is not a new product. It's been around for quite a while, and has a solid user base. It is an enterprise-class client/server CRM, with the kind of features you see in packages like Blackbaud's Raiser's Edge. What's new about MPower is that it has very recently been released as open source. I had a great sit-down with Randy McCabe (CEO) and Leo D'Angelo (CTO) of MPower at NTC. I heard a lot about the product and their plans, and I was impressed with their thinking and with the direction they are heading. Their basic idea makes a lot of sense to me, and it is clearly an idea that lots of companies releasing open source products are thinking: instead of increasing revenue by trying to milk as many current customers as possible (which is, frankly, the goal of many proprietary software vendors, especially those with very niche packages and little potential for growth in customer base), greatly broaden the number of customers by making the barrier to entry low. They expect to make up the revenue they used to get from licenses with services sold to a much larger number of organizations that would not have been customers otherwise. Lots of open source companies (Red Hat, MySQL AB, Novell, Alfresco, SugarCRM, Canonical) are doing similar things. For you purists: don't get all upset. Yes, it's a Windows product. Yes, it's written in .NET and C#. Yes, it requires MS SQL Server. So what? It's open source, and it is yet another option for organizations: an open source replacement for Raiser's Edge. How cool is that? And since it's open source, someone who really cares can port it to work with MySQL, etc. And it's got completely open APIs. All of that said, there are a few things I hope they consider. I hope they decide to go with an OSI-approved license (they are currently using their own, a modification of the Apache license; having looked at it, it's a fine license, but it would help them to use one that is already well known, like the GPL or LGPL). They also have, at this point, no community. They have a partner program, which is like a lot of partner programs: you have to be vetted, yadda yadda. Not at all in alignment with the open source ethic. They need to open their doors, make installing MPower easy (it isn't, at this point), and set up some community functions to help grow a community around the product, which will help it grow and provide avenues for developers to get involved and continue to help build the product. I'll be following MPower closely over the next months and years. I have high hopes for it. And Blackbaud may well be shaking in their boots. Here's some other coverage:

Continue Reading

Carnival of Nonprofit Consultants

On 14 Apr, 2008 By mpm With 2 Comments

I like hosting the Carnival of Nonprofit Consultants, mostly because I get to read blogs by people who aren't on my list of regularly read feeds, and I get to highlight the work of some of my favorite bloggers, too. Joanne Fritz asks the question that is probably on the minds of lots of folks in the nonprofit sector: what are we going to do in the face of the current problematic economic climate? She suggests not panicking and not changing course: keep steady, and keep communicating. SOX First, which is a new blog to me and focuses on Sarbanes-Oxley compliance, asks whether nonprofits hold the ethical high ground. Their answer: they may well be losing it. James Young, writing on Convio's newish blog, Connection Cafe, talks about how to find, and create, influencers. What are "influencers"? Read the blog entry. It's pretty interesting. Marketing and Fundraising Ideas tells us how not to ask for a major gift. There is an interesting case study of the marketing of Tampax in Africa on the Cause Related Marketing Blog. And Katya tells us why Seth Godin is right about people being lazy and in a hurry, and gives us some tips on how to use that. And lastly, since I'm hosting, I get to mention my recent post on Twitter and nonprofits. Next week, the Carnival is being hosted at A Small Change - Fundraising Blog.

Continue Reading

I'm hosting the carnival next week

On 11 Apr, 2008 By mpm

(Photo by frankienose) I'm hosting the Carnival of Nonprofit Consultants here, next week. So send in your best of the week!

Continue Reading

Twitter and Nonprofits

On 10 Apr, 2008 By mpm With 9 Comments

This actually began as a post to the Progressive Exchange discussion list. I love Twitter, which in some ways surprises me, and in some ways doesn't. It provides for me a sense of community, and a sense of what the people I know are thinking and talking about (in a certain realm; in other realms, not so much). I think that Twitter is, in many ways, a harbinger of the future. Eventually, a lot of what happens between people over the net will work a lot like Twitter, even if it isn't actually Twitter: social networks carrying short snippets of people's thoughts, ideas and events. But right now, at this moment, Twitter's demographic is both tiny and highly nonrepresentative of the population of the world. It is made up of people who are technically oriented, largely affluent, and who largely spend inordinate amounts of time in contact with electronic devices. We are still in the innovator phase here; not even the early adopters have signed on. There is no question that you will get out of Twitter what you give. And, wow, yeah, you can be highly successful on Twitter. But what does that really mean? If your organization's mission will greatly benefit from making connections with the Twitter demographic, then, yeah, Twitter makes tons of sense. And organizations and movements can certainly use Twitter to organize. I think that's a great strategy, as long as the majority of those to be organized are on Twitter, which is quite a stretch for most orgs or movements. But there is no way on earth that I am going to suggest to a client of mine whose demographic is mostly women over 50 that they spend any time on Twitter in trying to accomplish their mission, or even get the word out about what they are doing. Should the communications person use Twitter to connect with other nonprofit communications professionals? Heck, yeah; I could easily argue it will help them in their work.
Should they spend a bit of time tying their RSS feed (if they have one) to a Twitter account? Sure, why not. But should the organization as a whole put resources into a "Twitter strategy"? Or even a social network strategy? I'd be really hard pressed to suggest that they spend much of their meager resources on that. I know that people are doing fantastic work on the ways in which social networks can be powerful tools. And there are, for sure, some interesting case studies. And there are also some organizations for whom this makes sense, who have the resources, and who are ready to take good advantage of all of these tools, including Twitter. But from my perspective, working with organizations that sometimes have a hard time moving from a static website to a CMS, social networks in general, and Twitter specifically, are a long way away. I feel like what happens all the time is that we nptechies grab onto a new technology, and the first thing we think is that we've got to get organizations using it. And people in orgs hear all this buzz about this thing or that thing, and feel pressured and stressed to get on the bandwagon. And I feel like we don't spend enough time thinking about whether or not it is appropriate: whether it makes sense, whether it really is going to benefit the mission of the organizations we work with.
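
For the curious, "tying an RSS feed to a Twitter account" mostly boils down to pulling feed items and shrinking each into a status that fits the character cap (140 at the time). This is an illustrative sketch, not any particular tool; actually posting the statuses would then be an authenticated call to Twitter's status-update API:

```python
import xml.etree.ElementTree as ET

def feed_to_statuses(rss_xml, limit=140):
    """Turn RSS <item> titles and links into Twitter-sized statuses.

    Truncates the title so that title + space + link fits in `limit`
    characters.
    """
    statuses = []
    for item in ET.fromstring(rss_xml).iter("item"):
        title = (item.findtext("title") or "").strip()
        link = (item.findtext("link") or "").strip()
        room = limit - len(link) - 1  # leave space for " " + link
        if len(title) > room:
            title = title[: room - 3] + "..."
        statuses.append(f"{title} {link}")
    return statuses
```

Which is exactly why this is a "sure, why not" amount of effort for a communications person, rather than a strategy.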

Continue Reading

News in open source and open standards

On 03 Apr, 2008 By mpm

Here are a few interesting tidbits gleaned from the net:

Continue Reading

Michelle, the consultant

On 27 Mar, 2008 By mpm

Someone mentioned to me that, from what I write on my blog, she wouldn't know what it is I actually do in my consulting work. I thought that was surprising, but in thinking about what I write, I realize that people could get the wrong idea (or, more accurately, fail to get the right idea). And, truthfully, the blurb on my consulting site is kind of dry. Gotta work on that. So, what do I do? I think of myself in these terms: I educate, facilitate, mediate, and problem-solve. For one client, I am their technology go-to person, since they are really small and have no tech staff. I don't implement much for them (although in a pinch I'll set something up, or fix a specific problem). But I help them plan their technology initiatives (a new database, a new website, etc.), and help them find the vendors that will do the work by helping them craft good RFPs. I answer all of their tech questions, and solve pretty much all of their tech problems (mostly by helping them figure out who they should call). I'll be the project manager on their big new client database project, and help them think about how far to dip their toes into Web 2.0. For another client, I helped them vet vendors for their new website, taught them the difference between Joomla, Drupal and Plone, and helped free them from a vendor who was particularly egregious in their hosting charges, among other things. ($1200/month for an old, and therefore crappy, custom CMS and not much support. I. Kid. You. Not. I myself wrote a custom CMS a long time ago that would be considered crap now, so I don't blame them for that, but the charges???) For a third client, I helped them translate their ideas about what they wanted their website to do into things that could actually be implemented in a CMS. I helped them vet CMS vendors, made sure the CMS that was chosen could do the complex job they were asking of it (some were not up to the task), and am the intermediary between the current web vendor and the client, lending my expertise as needed and helping to move the project forward.

I think my clients benefit from one particular thing that I think is pretty unique. Although I am deeply experienced and knowledgeable in implementing technology, from networks to web applications and databases (I really know how DNS works, can write a left outer join in SQL, and know the difference between REST and SOAP), since I don't do implementation or coding anymore, I'm not wedded to one set of technologies. I can bridge the gap between technology vendors and clients in a way that is pretty unusual and, honestly, that I'm proud of. I know when a vendor knows their technology, and when they are blowing sales language at me and might not be up to the technical task. I can evaluate previous projects based on what I see as the underlying complexity, and figure out how much experience a vendor has had with a particular set of problems. I know when a technology is really appropriate, and when it's not. I have other kinds of projects as well, ones that use my talents somewhat differently. I do a ton of technical writing, and I have become the "documentation facilitator" of the OpenMRS project, an open source medical records system designed to serve clinics in low-resource areas (such as sub-Saharan Africa) that treat patients with HIV/AIDS. Since we're starting with the developer documentation, one of the coolest things about this project is that it's making me learn Java and Eclipse, plus get my hands dirty with Tomcat. Yum! I love learning new stuff. I like the range of projects I do. I'd like to do more of all of it. I'd also love to consult with for-profit vendors who are thinking about dipping their toes into offering support for open source projects, or open-sourcing their software. OK, enough of that. Now back to our regularly scheduled programming. :-)

Continue Reading

Free and Open Source tool #14: SugarCRM

On 27 Mar, 2008 By mpm With 2 Comments

Since I've been covering CRMs for the webinar today, I figured I'd switch categories on my free and open source software list. So for the next few tools, I'll be describing CRMs (Constituent/Contact/Community/Customer Relationship Management). The first one is one I've been using recently for my consulting business: SugarCRM. Unlike CiviCRM, which is targeted at nonprofits, SugarCRM is a system targeted toward sales in for-profit companies. It has three versions: Enterprise, Professional, and Community Edition, which is the one I've been using. They also have a newer on-demand version (that is, software as a service, like Salesforce). I've been hearing good things about SugarCRM from organizations that use it. It also gets kudos in NTEN's satisfaction survey (it came in third, after CiviCRM and Salesforce). Anyway, SugarCRM is basically "enterprise class" CRM, and is worth a look.

Continue Reading

How to choose a CRM

On 26 Mar, 2008 By mpm

I'll be doing a webinar on open source CRMs tomorrow. In the process of going deep into those CRMs, I've been thinking about how nonprofits might choose a CRM to begin with. Of course, all nonprofits already have a CRM (even if it is a spreadsheet); the issue is, generally, migrating to a new system, or integrating with what they already have to add new features. Idealware has a great article on CRMs: how they differ, and how you can begin to figure out what might work best. I also wrote a software choice worksheet that can help with the process of looking at a wide variety of tools. One of the fascinating things to me is how quickly the CRM space is evolving: new open source players entering the market (more on them soon), high satisfaction for other open source tools, and SaaS vendors throwing the doors open so that nonprofits can integrate their systems well (I'm psyched to hear about all the new connectors, mashups and apps happening all the time). The lesson here, I think, is that CRM, even for large organizations, is changing rapidly, and the days when expensive, proprietary, closed systems rule are numbered. I'm glad to see that. And I think that nonprofits, when they choose, should make sure to look at the wide range of options, some of which are very cost-effective and open. And you don't need to feel overwhelmed by vendor sales-talk; it's possible to get things translated into language you understand. After the webinar, I'll put the slides up in various places, and NTEN will have a recording.
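
To make the comparison process concrete, here is a minimal sketch of the weighted-criteria approach that a choice worksheet typically embodies. The criteria, weights, and tools below are invented for illustration; my actual worksheet may differ:

```python
def score_tools(weights, ratings):
    """Weighted-criteria comparison: sum(weight * rating) for each tool."""
    return {
        tool: sum(weights[criterion] * rating
                  for criterion, rating in tool_ratings.items())
        for tool, tool_ratings in ratings.items()
    }

# Hypothetical criteria weights (importance, 1-5) and ratings (fit, 1-5).
weights = {"cost": 5, "ease_of_migration": 4, "open_api": 3}
ratings = {
    "Tool A": {"cost": 2, "ease_of_migration": 4, "open_api": 5},
    "Tool B": {"cost": 5, "ease_of_migration": 3, "open_api": 2},
}
scores = score_tools(weights, ratings)
best = max(scores, key=scores.get)  # highest weighted score wins
```

The value of working through something like this isn't the arithmetic; it's that it forces an organization to name its criteria and argue about the weights before the vendor sales-talk starts.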

Continue Reading

Post-NTC Ramblings

On 24 Mar, 2008 By mpm With 1 Comments

It was a great week. I got to see lots of people that I only see once a year (or even less often), I got to meet a lot of new people, I had interesting and deep conversations, and I got to hear a lot about what people are thinking about the nonprofit technology field. I've got several posts on tap about specific aspects of the conference, or specific products and such that I'd come across during my time at NTC last week, so stick around. I want to give huge props to Holly and all of the NTEN crew for putting on a fabulous conference. It was incredibly well organized and smooth-running, the food and snacks were great, and the parties rocked. The conference was rich with great speakers and content, and it's great to see how far things have come. And, as a member of the NTEN board, I'm really proud of what's been accomplished. There are, of course, some tweaks we can make to make sure that everyone really can gain value from NTC, and that we can sustain the richness it contains. And I'm already looking forward to NTC in San Francisco in 2009!

Continue Reading

News from NTC '08

On 20 Mar, 2008 By mpm With 7 Comments

I've been having a great time here at NTC: running into lots of folks I'm happy to see again, meeting new people, learning about new things, and being involved. I don't have a lot of leisure to go into detail about what's happening here, but I thought I'd highlight a few things, and then when I get back home I'll write in much more detail.

I'm looking forward to having time to digest all that has happened here.

Continue Reading

On my way to NTC

On 17 Mar, 2008 By mpm

Tomorrow, I step off the plane in New Orleans to go to the Nonprofit Technology Conference: to see folks I only get to see once a year, meet new people, and chat and hang out with people I've been emailing, blog-commenting, and twittering with. I'm looking forward to it. I'm on the Evaluating Open Source panel with Laura Quinn and Catherine Lane, which should be great. I'm also holding the consultant spot on the panel "Changing your CEO from barrier to partner" with Marnie Webb, David Geilhufe, and Steve Heye. Lots of wonderful folks; these should be great panels. I'm in a little bit different place this year than last. Last year I was just re-emerging from having taken a break to go to seminary, and not quite sure what I was doing. This year, I'm much clearer about the directions my work is going in, and what I'm looking for at NTC. And of course, there is Penguin Day!! I'm excited to be doing it again, in a new city, with some great partners, including Aspiration, of course, Joomla, PICnet, and the Chicago Technology Cooperative. I think it will be a great time. (And I hope to bring home a penguin!) So, if you want to catch up with me, email me, and I'll send you my phone # so you can call or text me, or twitter me.

Continue Reading

If it's good enough for the Navy ...

On 13 Mar, 2008 By mpm With 1 Comments

In a surprising move, the US Navy will stop buying proprietary hardware and software, and only buy open systems.

"The days of proprietary technology must come to an end," he said. "We will no longer accept systems that couple hardware, software and data."

Basically, it seems the motivation is that open systems allow them to upgrade their capabilities rapidly, and they need to be able to share data freely. This could be a watershed moment. When governments in Europe started shifting to open source software, and to software using open standards, the use of FOSS in Europe increased rapidly. It could happen here.

Continue Reading

Reflection and Evaluation

On 10 Mar, 2008 By mpm With 2 Comments

Michele Martin, one of my fave bloggers, has a great post today on Reflective Practice. Both reflective practice (that is, the process of reflecting on what you do, and how you do it) and conscious, deliberate evaluation of projects are things that are not very common in our field, nor are they much valued or encouraged. In many ways, we are focused on solving technology problems, or completing projects. But I have really come to believe that the way we work with people is as important as the "final" outcome. We might build the most whiz-bang amazing website ever (in a technological sense), but if we haven't really thought about how we moved through the project, never evaluated how the project really went, and didn't learn from the process, then in the end the project wasn't the success it seemed to be. In fact, it's amazing how much we can learn from projects that might be considered failures by technological criteria. In the last few months, I was involved in helping three organizations choose vendors for varied technology projects, and in the course of that time I talked with almost a dozen technology vendors of one type or another. One question I asked all of them was whether they had a process of reflection and evaluation of their work, both while it was going on and as the project was coming to a close. Unfortunately, none of them had an answer to that question. That is something I would love to see change.

Continue Reading

Frustrations

On 09 Mar, 2008 By mpm

As some of you who follow me on Twitter know, I ran into frustrations a few days ago with WPA. In Kubuntu, the distribution of Ubuntu I had installed, the WPA-enabled Network Manager isn't installed by default (or at least it seemed not to have been installed when I did it; it could have been my fault). I knew that I should install it at some point, but I hadn't encountered a WPA network until last week, so I hadn't bothered. Needless to say, I'm doing that right now. But what I realized was that the whole WPA thing with my laptop added to the pile of "little problems I haven't solved yet." Now, of course, as a techie, and someone with a home network, multiple computers, and varied projects, there is always a list like this. But I've come to realize that now that I use Linux as my primary desktop, this list has grown much, much larger than it ever has been.

  • After spending close to five hours on the X Windows/driver problem I vented about last week, I gave up. I attached the nice brand-spanking-new monitor to my Mac Mini, and have been quite enjoying using it. Needless to say, I did absolutely nothing to get it to work. Plugged it in, and it just worked.
  • It took me a couple of frustrating hours to configure Samba (editing the smb.conf file and testing) so that I could share my home directory, with music and video, with my other computers, and share my printer. With my Mac, of course, I opened up the system preferences, checked a button, and, voila! Directories were shared.
  • I have outstanding issues or decisions to make with my kernel not seeing all the memory I've given it (therefore requiring a recompile, which I have been postponing for weeks), with sound, a webcam, a scanner, and accounting software. And there were several problems I never solved, including syncing calendars and address books, and finding a good time tracker. Those I "solved" by offloading the functions onto the web.
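
For what it's worth, the Samba setup boils down to a few stanzas in smb.conf. This is a minimal sketch (the workgroup name and paths are placeholders), not the exact config I ended up with:

```ini
[global]
   workgroup = HOME           # placeholder workgroup name
   security = user
   load printers = yes
   printing = cups

[homes]                       # shares each user's home directory
   browseable = no
   read only = no
   valid users = %S

[printers]                    # shares CUPS printers
   path = /var/spool/samba
   printable = yes
   guest ok = no
```

Fifteen lines of config isn't a lot, but finding out which fifteen lines, and testing them, is where the couple of hours went.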

There are several issues here, of course. First, although I've used Linux on the server for many years, and am used to getting things done via the command line, my primary desktop was a Mac for 20 years, so I am GUI-spoiled. For desktop functions (as opposed to server functions) that some people probably find trivially easy to do with the command line, I'm looking for a good GUI. Also, having used a Mac for so long, I'm "it just works" spoiled. In fact, what's funny is that things that do in fact "just work" with Linux almost surprise me. And, as Dustin pointed out in the comments to my venting post earlier, a lot of this is not the fault of open source desktop software developers. Hardware vendors don't release drivers for Linux, or if they do, the drivers remain proprietary. This does, for sure, hobble the usability of Linux on the desktop. Apple has the luxury of a hardware monopoly, so of course things are more likely to "just work." And, of course, a lot of resources and money have been poured into server software for Linux, but not as much into tools for the desktop. This is my dilemma. I am committed to the ethos of free software, and I've talked about how the means and the ends are the same, so it's important to me to use open source tools. But I also have to get work done for my clients. And I have to eat, too. Extra hours in the week spent dealing with technology problems are hours I don't spend working with clients. (I estimate I spend 2-3 additional hours per week just because I use Linux on the desktop.) The WPA fiasco a few days ago fell exactly at the moment when some really important work needed to get done for a client, so this sometimes hampers my ability to get things done. I do demand a lot of my system. I've got tons of peripherals, I'm constantly changing and modifying things; I'm a power user. If pretty much all I did was documents, email and the web, like many people, I wouldn't be having these issues.
I guess I'm looking to find the right balance: being able to use Linux on the desktop without feeling too bogged down in problems I need to solve. But I'm not there yet. Not only have I offloaded functions to the web, I'm beginning to offload some things to my Mac again (like scanning). It's easier for me to think about spending minutes rather than hours getting things to work.

Continue Reading

Free and Open Source Tool #13: Flock

On 09 Mar, 2008 By mpm With 2 Comments

I'm running behind, so I need to catch up in the next week or so. I'm still on internet clients, believe it or not, and still have some to go. Today, I'm talking about Flock. Flock is the "social browser." For bookmarking, it uses your del.icio.us account. It can bring in your photos (and the photos of your contacts) from Flickr. It can keep track of your Twitter friends, etc. You can also use it as a blog editor, which I am going to start trying out. It's amazingly good. I'd tried it a couple of years ago, and it was buggy, crashed, and seemed like a great idea that hadn't been realized. Now, it's realized. It's really quite nice. It's based on Mozilla Firefox, and apparently the developers of Flock contribute a fair bit back to the Firefox codebase.

Update: I've been using it as my default browser for the last day or so, and it is really growing on me.

Continue Reading

Talking at the Politics Online Conference

On 03 Mar, 2008 By mpm

Somehow, given that tomorrow is such a big day in the political realm, it seems quite appropriate that I'm headed down to DC to give a talk at the Politics Online Conference, an event of the Institute for Politics, Democracy and the Internet. I've never been; online politics has never really been my focus. But I'm quite looking forward to it. I'm speaking on the Open Sourced Advocacy panel with my colleagues Ryan Ozimek (of PICnet) and Jo Lee (of CitizenSpeak), as well as Michael Haggerty of Trellon and Alan Rosenblatt of the Center for American Progress. I'm looking forward to talking about free and open source software with this audience.


Why sometimes eating your own dogfood makes you want to throw up!

On 29 Feb, 2008 By mpm With 8 Comments

OK, so we all know that I have been eating my own dogfood (that is, using Linux on the desktop) for a while now. I even decided not to buy Leopard for my Mac Mini. And, for the most part, I\'ve been happy. I\'ve been able to do everything I need to do, and do it well. But there have been a few snags along this road, and I hit a very big one yesterday.

I got this brand new, wonderful LCD monitor - 22\", high contrast, 1680x1050 resolution. I thought I\'d be a pig in sh*t - I\'ve been living with an ancient 15\" LCD with dying pixels for a while. But Nooooooooo. No happiness for me. None. I spent 3 hours struggling with the Nvidia drivers (that\'s the on-board video my motherboard has), and my xorg configuration still doesn\'t work. (X Windows - and xorg, its current software implementation - is how Linux displays the graphical user interface.) Every new version of xorg.conf leads me down a garden path to nowhere. I downloaded the brand-spanking-new nvidia drivers, so that I\'d be ready to deal with such a high resolution. No go. At this point, I still have to futz with the configs every time I start up, and it still isn\'t right. I\'ll send off queries to the right mailing lists and forums, and probably eventually get it all worked out. But plugging in a new monitor just should not be this hard.

X Windows has always been the bane of my existence. I really have come to think that xorg has it in for me. My refrain about it has always been \"I hatesssss xorg, I hatesss it.\" Someone in an IRC channel last night who was trying to help me as I tore my hair out asked, \"why does xorg suck so bad?\" With all of the amazing examples of really great free and open source software, here is an example of one that just isn\'t what it should be.
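For the curious, the sort of stanza I was fighting with looks roughly like this - a sketch only, with illustrative identifiers and sync values, not my actual (non-working) config:

```
Section "Device"
    Identifier "Card0"
    Driver     "nvidia"          # the proprietary nvidia driver
EndSection

Section "Monitor"
    Identifier  "Monitor0"
    # These ranges are placeholders; get them wrong and you get a
    # blank screen, or a fallback to some tiny resolution.
    HorizSync   30-83
    VertRefresh 56-75
EndSection

Section "Screen"
    Identifier   "Screen0"
    Device       "Card0"
    Monitor      "Monitor0"
    DefaultDepth 24
    SubSection "Display"
        Depth 24
        Modes "1680x1050"
    EndSubSection
EndSection
```

Multiply that by every combination of driver, monitor and mode, and you can see how guessing your way through it eats an evening.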


Free and open source tool #12: Miro

On 26 Feb, 2008 By mpm

Miro used to be called \"Democracy Player.\" Miro is basically a video player that can recognize RSS feeds and automatically download videos. There are channels for everything. PBS has quite a number, as do various and sundry video podcasters. I get Democracy Now, ABC politics, the Webb Alert (a daily geek news headlines show,) Bill Moyers Journal, and lots of others. It can download videos via BitTorrent as well. You can search YouTube, Google Video, or about 10 other video sites, and make those searches a new channel. It\'s a pretty amazing tool. And it makes disseminating your organizational videos easy as well. It\'s cross-platform and works really well (the old player was a bit buggy, but those bugs have been smoothed out of late.) It is, I imagine, what the future of television will be.


No more custom CMS!

On 18 Feb, 2008 By mpm With 10 Comments

This is a rant. And it is a rant on behalf of the hundreds (thousands?) of nonprofit organizations whose website is stuck behind a custom CMS - one that was written by some web development shop or another - and for whom migration off of that custom CMS is going to be a nightmare. As the author of a custom CMS (it did have the advantage that it was released as open source, but it never caught on, so it still counts as custom), I know what it is like to put my heart and soul (and time) into a CMS, and want my clients to get what they want. I wrote that CMS back before there were any really good open source ones, like most of the custom CMS out there.

But that was then, and this is now. There are quite a number of really good CMS systems (both open source and proprietary - I\'d say there are a good solid dozen) that have large user bases, many developers and vendors who implement them, and lots of new modules and functionality being added every day. There is absolutely no way that one single web development shop can provide a CMS solution that is better in quality or functionality than what is available out there right now. In fact, even if you just focus on the \"big three\" open source CMS - Drupal, Joomla and Plone - 85% of nonprofits will likely have their needs fully met. The other 15% might want or need a more specialized CMS (like OpenACS, or a proprietary one,) or might need some modules developed for them. Most custom CMS that I\'ve seen lately are sorely lacking in features and/or usability in comparison to what\'s out there and available.

Of course, one could argue that migration off of one of the more popular CMS to another one is as difficult as migration off of a custom CMS. This isn\'t the case, for a couple of reasons:

1) The more popular these CMS get, the more people need migration help, and the more resources are available for them (just google \"joomla drupal migration\".)
2) More people than just the person who set the CMS up can help do the migration.

Unfortunately, relationships with vendors go bad, and being stuck with data in a custom CMS makes migration away from a bad relationship that much harder. This is the moment for nonprofits to stop accepting proposals with custom CMS, and to make it clear in the RFP that a custom CMS will not be acceptable. It\'s also the time for web developers to let their babies go, and start building their business on a well-developed CMS. (Hint: I hear there is way more Drupal demand than supply of expertise.)
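To make the migration point concrete: most of the pain in leaving a custom CMS is mapping its one-off database schema onto the content model of a standard CMS. A rough sketch in Python, with an entirely hypothetical legacy schema (the field names are invented for illustration, not taken from any real product):

```python
# Hedged sketch only: the legacy field names ("page_title", "body_html",
# "is_live") are hypothetical, standing in for whatever the custom CMS used.

def to_generic(row):
    """Map one record from a hypothetical custom CMS to a neutral dict
    of fields that most standard CMS importers understand."""
    title = row["page_title"].strip()
    return {
        "title": title,
        "body": row["body_html"],
        "slug": title.lower().replace(" ", "-"),
        "published": bool(row.get("is_live", 0)),
    }

legacy_rows = [
    {"page_title": "About Us ", "body_html": "<p>Hello</p>", "is_live": 1},
    {"page_title": "Old Draft", "body_html": "<p>WIP</p>"},
]

generic = [to_generic(r) for r in legacy_rows]
print(generic[0]["slug"])       # about-us
print(generic[1]["published"])  # False
```

A real migration also has to handle users, categories, files and redirects, but the shape is the same: one small mapping function per content type - something anyone, not just the original developer, can write once the schema is documented.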


Free and Open Source Tool #11: Azureus

On 15 Feb, 2008 By mpm

Azureus (now called Azureus Vuze) is the best BitTorrent client I have ever used. It\'s quite amazing. It\'s got a lot under the hood - way more than I could even talk about intelligently. But that is great: if you know your stuff, you can get a lot of performance out of Azureus. BitTorrent is a bit of an arcane art (and, of course, getting a bit of a bad rep, since it\'s a major avenue for P2P pirating.) Azureus is cross-platform and written in Java (and, I think, shows off the strength of the Java framework.) People have written all sorts of cool plug-ins for it. The next version, called Vuze, which I haven\'t yet used, looks like it incorporates a media player and channels and such - basically becoming a serious competitor for Miro, which I\'ll talk about in the next post. This isn\'t really such a useful tool for most organizations, although having a BitTorrent client around for downloading Linux ISO images is really useful, and on breaks, you can watch the occasional episode of the Daily Show... (just kidding.)


Tidbits

On 14 Feb, 2008 By mpm With 1 Comments

These are tidbits of things I\'ve gotten recently from vendors, or gotten via feeds or twitter.

  • Kintera opens a Developers Challenge. Developers who code solutions that integrate with Kintera using their open API platform, Connect, can win \$15,000 or \$5,000 (not the \$25K their big logo seems to suggest - that\'s just the total they will award.) But first, of course, you must be \"verified\" as a Kintera Connect partner. Sigh. When will people learn that to be open, you need to really be open?
  • Click and Pledge, a company that does SaaS for nonprofits, released a new product called \"Trio\". Trio is an integration of SugarCRM, Joomla, and a credit card payment system. This is not only cool from the perspective of the integration of two great open source web apps, but it is also a very interesting business model. Setup of all three has a one-time fee. Then, all monthly hosting fees are waived if more than a certain amount of money is transacted using the payment system. The hosting costs, if you don\'t qualify for free hosting, are pretty reasonable.
  • Matt Asay, blogger of all things in open source biz models, thinks Google Code may have overtaken Sourceforge. He asks: \"Will the world notice a diminished Sourceforge? I think so, but maybe I\'m just nostalgic.\" Um, Matt, Sourceforge has been basically irrelevant for years, since people started moving their projects off of that platform and onto their own. New projects seem to crop up more on Google Code than on Sourceforge nowadays.
  • Mozilla Labs announces the winners of their Extend Firefox2 contest - the best Firefox add-ons. Some definitely cool stuff I\'ll have to have a look at.


An interesting call from danah boyd

On 08 Feb, 2008 By mpm With 1 Comments

Those of you steeped deeply in Web 2.0 know danah boyd. She\'s a brilliant academic who studies social networks. A couple of days ago, she made a call on her blog for academics to stop publishing articles in closed journals.

On one hand, I\'m excited to announce that my article \"Facebook\'s Privacy Trainwreck: Exposure, Invasion, and Social Convergence\" has been published in Convergence 14(1) (special issue edited by Henry Jenkins and Mark Deuze). On the other hand, I\'m deeply depressed because I know that most of you will never read it. It is not because you aren\'t interested (although many of you might not be), but because Sage is one of those archaic academic publishers who had decided to lock down its authors and their content behind heavy iron walls. Even if you read an early draft of my article in essay form, you\'ll probably never get to read the cleaned up version. Nor will you get to see the cool articles on alternate reality gaming, crowd-sourcing, convergent mobile media, and video game modding that are also in this issue. That\'s super depressing. I agreed to publish my piece at Sage for complicated reasons, but... I vow that this is the last article that I will publish to which the public cannot get access. I am boycotting locked-down journals and I\'d like to ask other academics to do the same.

It\'s really worth a read. If I were still an academic, I\'d totally take her up on it. She is also realistic - she describes in detail in the post what people can do, whether they are tenured or not. I agree with her that open journals are the future. She says, at the end of her long entry:

Making systemic change like this is hard and it will require every invested party to stand up for what they know is right and chip away at the old system. I don\'t have tenure (and at this rate, no one will ever let me). I am a young punk scholar and I strongly believe that we have a responsibility to stand up for what\'s right. Open-access is right. Heavy metal gates and expensive gatekeepers isn\'t. It\'s time for change to happen! To all of the academics out there, I beg you to help me make this change reality. Let\'s stop being silenced by academic publishers.


Free and open source tool #10: Filezilla

On 07 Feb, 2008 By mpm

I decided that most of the tools I\'ve been talking about so far (except WordPress and Joomla) are internet clients for one type of protocol or another. I figured I\'d keep on this track for a while - there\'s lots to talk about. Next up, Filezilla. I\'ve used more FTP clients in my time than I can even begin to remember, from command-line ftp, to WS-FTP, and lots and lots of others (I have this memory of a really old, clunky FTP client for Mac OS 7 or something that I was using a lot, back when all filesharing was via FTP.) Sometimes I wish I had something like Transmit for Linux - a Mac OS X client, and the slickest, most feature-rich FTP client on the planet (but, sadly, not free in any sense of the word.) Filezilla isn\'t that slick, but it does the job nicely. It has shortcuts for all of your servers, has nice drag and drop for moving files around, allows you to do all sorts of remote actions on files, etc. It handles FTP, SFTP, and FTP over SSL/TLS. I use it all the time, and I really like it. I do think that it\'s probably the best GUI FOSS FTP client for Linux there is. Oh, and there is a Windows version, too.


Data Portability update

On 06 Feb, 2008 By mpm

If you\'re not so connected either to the \"twitterverse\" or the web industry, you probably haven\'t heard a lot about the buzz that is currently happening around the issue of data portability, and the dataportability.org organization and effort. I figured, since I\'ve been getting a bit involved in the community, I\'d give a bit of a summary of what\'s going on, and what will possibly come from this effort. Dataportability.org - the organization - has gotten a lot of press in the tech industry lately because some very big players recently joined, including Google, Facebook, and Microsoft.

So first, what is data portability? Basically, it means that the data you put into social networking sites - profiles, social graph (those who you are connected to,) media, etc. - are *yours* to do whatever you want with. In addition, they are portable - you can move your data from place to place. And you have control over who can see what. There is a good blog article which, in some regards, might be seen as a criticism of the dataportability.org group, but which, to my mind, actually defines quite well what I\'ve thought data portability means. Its author talks about data \"accessibility,\" \"visibility,\" \"removal\" and \"ownership\" - all things that, to my mind, are components of data portability.

I\'m involved in the evangelism action group. So, I\'m evangelizing. I\'ll be doing an entry soon, sort of a \"how social networks could use open standards 101.\" I think as nonprofit organizations begin to work more and more with Web 2.0 tools, they need to understand the implications of what they do, and demand that the tools use open standards.
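To make \"open standards\" a little more concrete: one existing standard for portable profile and social graph data is FOAF, an RDF vocabulary for describing people and who they know. A toy sketch (the names are made up; the namespaces are the real ones):

```
<rdf:RDF xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#"
         xmlns:foaf="http://xmlns.com/foaf/0.1/">
  <foaf:Person>
    <foaf:name>Jane Example</foaf:name>
    <foaf:knows>
      <foaf:Person>
        <foaf:name>Robin Example</foaf:name>
      </foaf:Person>
    </foaf:knows>
  </foaf:Person>
</rdf:RDF>
```

Because the vocabulary is standardized, any site or tool that speaks FOAF can read that profile and graph - which is exactly the kind of accessibility and portability being asked for.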


Free and open source tool #9 : Pidgin

On 05 Feb, 2008 By mpm With 1 Comments

While I\'m on the subject of chat, I figured I could talk about Pidgin. Pidgin is a multiprotocol IM (Instant Messenger) client. It can handle quite the long list of chat protocols: AIM, Bonjour, Gadu-Gadu, Google Talk, Groupwise, ICQ, IRC, MSN, MySpaceIM, QQ, SILC, SIMPLE, Sametime, XMPP, Yahoo!, Zephyr. A number of these I\'ve never heard of. I don\'t use it for IRC (see last post,) but I do use it for AIM, GTalk, MSN, ICQ and Yahoo messenger (yes, I have accounts using all of those protocols. Should I hunt up Zephyr?) Pidgin is available for Windows and Linux. It used to be called GAIM. The engine underneath Pidgin is called libpurple - which is also underneath Adium, the FOSS IM client for Mac OS X. (Adium is what I used when I was on a Mac desktop.) Pidgin is great software. It\'s the best FOSS IM client I\'ve used so far (and I\'ve used quite a few.) It\'s got great plug-ins, too.


Free and open source tool #8: XChat

On 05 Feb, 2008 By mpm With 1 Comments

This is really a post about both a tool (XChat) and IRC (Internet Relay Chat.) XChat is one of quite a few IRC clients. It is available for both Windows and Linux, and there is a port called XChat-Aqua that works natively on Mac OS X. IRC is an incredibly useful tool. It is basically group synchronous chat. It is used predominantly in the open source world, for developers and users of open source projects to talk to one another and get support. I use IRC every day - quasi-social, quasi-professional. (Like right now, on the Linuxchix IRC channel, we are discussing elections, HFS+ filesystems and terabyte switches.) Of the IRC clients, I like XChat the best, although I\'ve tried quite a number. The interface is easy to learn and very clearly laid out, and there are lots of options. It\'s also scriptable. I know that for people who work in real offices, with real other people, IRC is a difficult tool to use - it takes you away from the environment you are in, and makes it hard to be a part of multiple conversations. On the other hand, a lot of people are IMing, and tweeting, etc. I like it because I can get fast technical help, and since I don\'t have many people around me most days, it provides a bit of a social atmosphere - like the water cooler you go to when you\'re taking a break. And if you want to find me on IRC, go to either the Linuxchix IRC server or to irc.freenode.net, #nosi and #nptech.
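Part of why there are so many IRC clients is that the protocol underneath is a refreshingly simple line-based text protocol. A rough sketch of what a client like XChat sends to join a channel and say something (server replies trimmed; the nick and name are placeholders):

```
NICK examplenick
USER examplenick 0 * :Real Name
JOIN #nptech
PRIVMSG #nptech :hello, everyone
QUIT :back to work
```

Scriptability falls out of that simplicity - a script just emits a few more of these lines.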


New tools

On 05 Feb, 2008 By mpm With 1 Comments

Last week, I spent too much time watching demos from DEMO, a twice-yearly event that showcases the most bleeding-edge internet technology. It looked like it might be a fun event to be at, but the hefty \$3000 price tag for admission wasn\'t something I could possibly stomach, so I sat in front of my laptop, watching people describe new tools. There weren\'t a whole lot of new tools that looked especially useful in the short term for nonprofits, but I\'ll highlight a few that I think might be.

  • Of course, there is a lot of buzz about Sprout.  Sproutbuilder is this amazing drag-and-drop widget maker you\'ve just gotta try (invite here - just start, and it will prompt you to create an account when you want to save your sprout.)  Carnet Williams, of nptech fame, did a demo of Sprout at Demo.
  • Another really amazing tool is Blist. Think of Blist as an interesting combination of a prettier Excel with some very cool features thrown in from FileMaker Pro, in a sweet-looking interface that works in any web browser. It\'s being billed as the \"easiest database,\" and I pretty much have to agree. Have a look at the demo. It\'s still in \"private\" beta (ask for an invite; you\'ll probably get one - I did.) There are a lot of features that are unfinished, but what\'s done is polished and pretty smooth. You can share these Blists. So this is definitely a tool to watch.
  • good2gether is an interesting concept. Watch the demo. Basically, they partner with localized web media outlets, and provide widgets that contain information about local nonprofits that are connected to content. So if the article is about a fire, you might see on the widget information about the American Red Cross, or other local nonprofits that address the needs of victims of fire. Corporations sponsor the widget, so there is a brand showing. As a nonprofit, what you would do is set up a profile, and I imagine tags or keywords would indicate where your link would show up.
  • Seesmic is also getting a lot of buzz, certainly in the \"twitterverse,\" but also elsewhere. Seesmic is a video conversation site. It\'s an interesting concept. Like a lot of Web 2.0 stuff, I think it will take a while to figure out how nonprofits can use it. I am, of course, waiting for Beth to tell us.

There were a bunch of other tools, and I look forward to seeing which of them emerges to become more mainstream.


Free and open source tool #7: Firefox

On 31 Jan, 2008 By mpm With 1 Comments

This almost feels like cheating, talking about Firefox. Firefox is, of course, the FOSS application that is on more desktops than all the others combined. Among browsers, its market share is still way behind IE - but that\'s mostly, of course, because IE is the default browser on every Windows machine. Firefox is arguably the best web browser there is. It is certainly better and more secure than IE. What makes it even more powerful is that there are tons of add-ons that make it even better. Right now, I\'ve got AdBlock Plus, Greasemonkey (a scripting platform that allows for lots of other interesting add-ons,) Tor (allows for anonymous browsing,) a bunch of Google toolbars, some great web developer tools, etc. And, further, because it\'s open source, there are some great spin-offs that I\'ll talk about later. And I can\'t really talk about Firefox without mentioning Iceweasel, Debian\'s rebranded browser, based on Firefox. (And yes, it would be cheating to make that a separate blog entry!) Because the Firefox artwork is proprietary (and therefore a violation of Debian\'s free software guidelines), the browser was rebranded.


The search for good web conferencing, take 2

On 31 Jan, 2008 By mpm With 4 Comments

Back in August, I did a review of web conferencing tools, with a decidedly unusual slant - as a presenter, I had to be able to share my Linux desktop. It was perhaps an odd perspective, but in any event, I figured it was time to revisit this, and review what I\'ve found. Earlier, I\'d found that the only tools that would work with presenters using Linux were ReadyTalk and WebHuddle. ReadyTalk is proprietary and not free (as in beer). WebHuddle is free, and open source. There does seem to be an active (but small) user community. There is, however, only one developer, and there hasn\'t been a release in a while, so it\'s unclear how viable WebHuddle is long-term.

I had been encouraged to look again at Yugma, because they said that Linux desktop sharing would happen before the end of last year. Well, it seems that it is still \"coming soon.\" But interestingly, Yugma is now integrated with Skype, which totally changes the whole audio-on-a-separate-channel issue. It does mean that everyone involved in a webinar needs to install Skype - but that seems to be a minor issue, to my mind. But you can\'t use the Skype edition on Linux yet, either.

I went back to look at DimDim - and lo and behold - DimDim went GPL! They now have a community edition, and there seems to be an active community of users. In addition, DimDim has an integration with Moodle (PDF)! And also an integration with SugarCRM. Now things are getting interesting. Perhaps if DimDim were also to release a version that integrated with Skype ... I can\'t figure out from looking around their site whether it is cross-platform enough to share desktops, but I signed up for the beta, so I guess I\'ll test it out. In doing a bit more research (which I guess I hadn\'t done this summer), it turns out that WebEx seems to allow desktop sharing with Linux. I\'m hoping to test it out soon as well.

This is what I want: the community edition of DimDim, integrated with an open source VOIP system and Moodle, that can share my Linux desktop. That would be the holy grail. But at least it does appear that there might be increasing numbers of options out there for the likes of me.


Free and open source tool #6: Joomla!

On 29 Jan, 2008 By mpm With 5 Comments

I don\'t exactly know where the exclamation point came from, but if you want a scarily easy CMS to install, Joomla is a place to start. Like any powerful CMS, though, there is quite the learning curve in order to get a site up and running. But at least the first technical hurdle is a small one. Joomla is growing fast. They just released version 1.5, which, I must say, rocks. I\'ll be migrating my main consulting site over to it quite soon. They have an amazing user community, and there are places to get great themes. I\'ll mention the other FOSS CMS systems in other posts. I\'ll have to admit to my preference for Joomla, although Drupal is growing on me more and more. I haven\'t spent enough time with Plone to really get a feel for it. Joomla does have an interesting history - it was a fork of a project called Mambo, which is way less popular than Joomla at this point. If you want to try to get a small website going for your organization, Joomla is a good place to start. It installs easily on generic virtual hosts, and has a very sweet, eye-candy-full admin interface. It will take some time and effort to get a site up, but that\'s standard for any website. It will probably take you less time than a generic HTML site will.


What I\'m up to these days

On 29 Jan, 2008 By mpm

As you might have noticed, my blogging has diminished a bit. It\'s because, basically, I am about as busy as I\'ve been in a very long time. It feels quite good, actually. And I have lots of thoughts about what I\'m doing.

I\'m spending about half of my time being Coordinator of NOSI. As you\'ve probably seen, we\'ve been pretty busy lately. Soon, we\'ll be updating the primer, releasing a report on our survey, starting a training and consulting program, doing a webinar series ... phew! Lots happening! And I\'m learning a lot about what it\'s like to be a leader of a small, struggling nonprofit with big ideas.

The other half of my time I\'m doing consulting work, focusing exclusively on helping organizations without a lot of technical expertise navigate their way through technology projects. I love this work. I have some great clients, and I feel like I get to be an educator - I spend lots of time educating my clients on the ins and outs of the varied technology issues presented (and what is, and is not, a technology issue.) I get to use my expertise in web application development, but not have to do any web application development (which feels to me a lot like having my cake and eating it too.) And I think I also get to educate the varied vendors and developers I\'ve been in contact with - educate them about the clients\' needs, and, to some extent, hear about and share best practices in doing this kind of work. And I get to be agnostic. Yes, indeed, for some clients and some situations, there are appropriate proprietary solutions.

I\'ve come to understand what I value about some development shops, and what I don\'t value about others. I like proposals that focus on the project, and show clear understanding of it. Pretty presentation with no content is useless. I am pretty instantly aware of when the person I\'m talking with knows what they are doing. I\'m made comfortable by folks who speak what I think of as a good mix of development-speak and normal language. I\'m turned off instantly by sales-speak. I appreciate shops that I know are giving back to the community, and that are known quantities in the nptech realm. I hate to be pushed.

I am beginning to get a really solid understanding of what it takes for nonprofits of all sizes to navigate the technology waters. What\'s great is that although it\'s true that there are sharks in the water, there are also lots of great dolphins to swim with, and I\'m happy to be helping organizations find them.


Free and open source tool #5: WordPress

On 24 Jan, 2008 By mpm With 1 Comments

It seems like a good day to talk about WordPress. Why? Automattic, the makers of WordPress and WordPress.com, just got a big chunk of \"Series B\" funding. (Not being much of a capitalist, I don\'t really know what \"Series B\" funding means, but I\'m imagining it\'s a very good thing.) Here is yet another amazing free and open source tool getting a lot of good attention. Anyway, WordPress is a blogging tool (in fact, the one that runs this blog.) It is a great blogging tool - another of those open source applications that \"just works.\" Installation of WordPress is scarily easy. WordPress is expandable with tons of plugins. The best one, by the way, is Akismet, which is also made by Automattic. It basically eliminates comment spam, which, as you probably know, is the bane of bloggers everywhere. Because WordPress is so easy to use, people have twisted and turned it to make regular websites. I think this is generally a Bad Idea, since there are so many easy, good CMS tools out there (I\'ll be naming a few in this series.) But if your organization decides to blog, and you want to make it easy on yourself, install WordPress on your hosting account, or go to WordPress.com and set up a free blog. I doubt you\'ll look back.
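\"Scarily easy\" mostly means: unpack the tarball, fill in a handful of database settings, and run the web installer. The heart of the setup is wp-config.php, which boils down to something like this (the values are placeholders, obviously):

```
<?php
// wp-config.php (excerpt) - database credentials for your host
define('DB_NAME',     'my_blog');      // the MySQL database WordPress uses
define('DB_USER',     'blog_user');
define('DB_PASSWORD', 'change-me');
define('DB_HOST',     'localhost');
$table_prefix = 'wp_';                 // lets several installs share one database
```

Fill those in, browse to the install script, and a few minutes later you have a blog.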


Free and open source tools #1 - #100

On 17 Jan, 2008 By mpm

I just though I\'d take a brief pause to explain my criteria for these 100 tools that I\'ll be covering this year. All of the tools I will cover are tools that:

  • I use at least weekly (or, for a few, have at least installed and tested out)
  • have an active user and developer community
  • have a community I know I can get my questions answered from
  • are good enough that you can get real work done using the tool (in fact, under most circumstances, you could do mission-critical work with it, if your mission called for it)
  • don\'t require you to code to do what should be basic tasks (for instance, this eliminates an otherwise good time tracking program, which at some point I might blog about, but which requires you to learn Scheme[1] to get customized reports of any complexity)

I\'ll describe what you\'ll give up with these tools (if anything) compared to their popular proprietary counterparts. These aren\'t half-baked, buggy tools that are not ready for everyday use in organizations.

[1] Scheme is an obscure programming language that most Computer Science students learn, but almost no one else does, and almost no one produces production code in it.


Free and open source tool #4: GIMP

On 17 Jan, 2008 By mpm With 3 Comments

GIMP stands for GNU Image Manipulation Program. I\'ve come to depend on it, first because I couldn\'t justify paying for a Photoshop upgrade when I moved to an Intel Mac. Now, it\'s one of a very few choices that work on Linux - and it\'s the best by far. GIMP is a very full-featured image manipulation program. Just about all of what Photoshop does, it does. I\'m not a designer or photographic expert, but it\'s a pretty amazing program. If you want good info, there is a great book by a fellow Linuxchix, called \"Beginning GIMP\" by Akkana Peck.

If you are a serious designer, GIMP has its drawbacks, specifically its lack of CMYK and Pantone color spaces, which, I understand, are pretty much a requirement for serious printing (but who prints nowadays? Just joking.) There are some other things that GIMP lacks that Photoshop has, but 90% of users probably won\'t notice. If there is one really major complaint about GIMP, and it\'s one that I harbor, it\'s that the UI, well, leaves much to be desired. It\'s not just that it doesn\'t look like Photoshop (you can check out gimpshop - it puts a Photoshop UI on top of the GIMP libraries.) It just isn\'t intuitive to use (Photoshop isn\'t either, really.) I think because of this, GIMP is missing out on the chance to become a much more popular program. If you\'re a geek, you are used to getting to know new UIs, and putting up with controls that are difficult to understand or get used to. But if you\'re not a geek, GIMP\'s UI is a major hurdle. And if you\'ve been used to the Photoshop workflow, it will take a lot to get used to the very different GIMP one.

Oh, and one more thing. The name has to go.


This week in FOSS

On 17 Jan, 2008 By mpm

  • Sun Microsystems buys MySQL AB for \$1 Billion (yes, that\'s ONE BILLION DOLLARS)
  • Acquia (a Drupal company) gets a large chunk of change (no, actually that was last month in FOSS, but it\'s part of the picture.)
  • OpenAds, an open source ad server (very cool) just got tons of \$ in financing.
  • A company that provides services for Ruby on Rails got a bunch of funding.

So what does this mean for you, o struggling nonprofit organization? Open source is becoming mainstream, and people (that is, people with money) are starting to throw big bucks in the direction of open source projects, and of companies that provide services for them. This is going to make these projects better, and make support for them more available. Because these applications are not proprietary, anyone can get their hands on them and install and use them - which means nonprofits get the benefit. Because of the nature of open source, more money in the open source ecosystem is a good thing, and it is my belief that, unlike \"voodoo economics,\" this will actually be a tide that lifts all boats.


So where is open source in the nptech ecosystem?

On 16 Jan, 2008 By mpm With 4 Comments

I've had a few interesting things happen lately which have me wondering about what's happening with open source, and the perceptions of open source in the nonprofit sector. As you know, NOSI is doing a survey on the use of FOSS in the nonprofit sector. It's been quite slow - we have only gotten about 85 responses (so please, please, if you haven't yet, fill it out.) I know that surveys only reach small subsets of the communities they are trying to assess, but this seems very low, considering that probably 5,000+ people saw the announcement (adding up the totals for the varied list subscriptions.) We have also been accused of creating a biased survey. In a sense, we are, of course, quite biased: it is NOSI's purpose to advocate the use of FOSS in the sector. But I wondered whether simply being in that position means that we will garner certain kinds of responses, and not others (interestingly, though, 25% of those filling out the survey use open source less than daily, which, considering Firefox, I thought was interesting.) Another interesting thing was that I wrote an article for TechSoup, on "The True Cost of Free and Low Cost Software." I got some interesting comments (especially the one that said "the author doesn't really seem to understand the distinction between free, open source, and proprietary software", which I thought was a hoot.) Anyway, they were looking for a different article - one that was more about the advantages of FOSS, not about the broad category of free (as in beer) and low cost software, which includes FOSS, proprietary, and SaaS. I said this at the end of my response:

This makes me wonder whether things have changed. In the past, people cared much more about whether or not something was free (as in beer) or cheap, and whether or not it was open source wasn't on the radar. Now, it seems that people well understand that acquisition cost isn't everything, and what's more important to some is free (as in libre). Perhaps it's time to change the message, a bit.

Of course, one can't base anything on two forum comments, but I wonder if we haven't turned a corner in the conversation. Perhaps we don't have to spend so much time on this issue anymore. Comments?


Free and open source tool #3: Dokuwiki

On 15 Jan, 2008 By mpm With 2 Comments

I have become a fan, nay, a devotee of DokuWiki. I've always liked wikis, and I have used MediaWiki a lot in the past, and I do like it. DokuWiki is different in a number of ways, most primarily in that it is one of the wiki systems that stores things in files, not databases. This means that it is easier to back up and migrate, but doesn't scale well; DokuWiki was designed for small scale installations, primarily documentation and such. The one feature that makes my day: draft autosave! I love it! One drawback is that DokuWiki's syntax is different from MediaWiki's, and so the more I use it, the more I forget when I use MediaWiki. But I'm converting my tech wiki from MediaWiki to DokuWiki. I also have it installed on my home desktop, for notes, journaling and the like. It's a great replacement for text or word processing files. Anyway, it's worth checking out.
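Because DokuWiki keeps each page as a plain text file (under data/pages/ in a default install), "easier to back up" really does mean "just copy the files." A minimal sketch in Python - the throwaway directory here stands in for a real install path, which you would substitute:

```python
import pathlib
import tarfile
import tempfile

# Stand-in for a real DokuWiki install directory (path is an assumption).
root = pathlib.Path(tempfile.mkdtemp()) / "dokuwiki"
(root / "data" / "pages").mkdir(parents=True)
(root / "data" / "pages" / "start.txt").write_text("====== Start ======\n")

# A full backup is just an archive of the directory tree -- no database dump.
backup = root.parent / "dokuwiki-backup.tar.gz"
with tarfile.open(backup, "w:gz") as tar:
    tar.add(root, arcname="dokuwiki")

# The archive contains the page files verbatim.
with tarfile.open(backup) as tar:
    print(tar.getnames())
```

Restoring (or migrating to another host) is the same operation in reverse: untar and go.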


Update on social network portability

On 08 Jan, 2008 By mpm

Last week, I covered the Robert Scoble dust-up. Thanks to Twitter (hat tip to marshallk), I learned about today's big news: Google, Plaxo and Facebook joined the DataPortability working group. This, of course, doesn't mean that all of a sudden everyone's social graph and data will become portable, but it's a very good sign that perhaps, after all, things are moving in that direction. I think that people are getting wary of social networks where they have no control over their own data. And, of course, nonprofits should be especially keen on being able to keep control of their data. I'll keep you posted, for sure. Read/Write Web and TechCrunch have good coverage of this. Update: LinkedIn, Flickr, SixApart and Twitter have now joined Dataportability.org. This is, of course, great news. But the real question is: will this actually result in data portability?


Please take the NOSI survey

On 07 Jan, 2008 By mpm

In my work with NOSI (the Nonprofit Open Source Initiative), I've become really interested in how FOSS is used in nonprofit organizations. I think this is data we need, so that we can better understand what gaps are present, and what we can do to fill them. This is the first of an annual survey, and we'll be releasing a report next month with the results. It will take about 5-10 minutes to fill out. Please take it no matter what the level of use of open source software is in your organization - data on as wide a range of organizations as possible will be helpful to us. Please encourage your colleagues and clients to take this survey as well. Take the survey


Another good reason for nonprofits to use FOSS

On 06 Jan, 2008 By mpm

This is an amazing example of the kind of flexibility that is difficult or impossible to get with proprietary software. Miro, the free and open source media player, has released a Firefox plugin which automatically inserts their affiliate code when you buy something from Amazon. It's really simple to use, and you don't have to put links on your website, etc. This seems like something many nonprofits might want to try out. Hat tip to Jon Stahl for the heads up.


Free and open source tool #2: Limesurvey

On 04 Jan, 2008 By mpm

I am in the process of writing a survey for NOSI, which you will hear all about next week. I had originally done the survey in SurveyMonkey, which is a slick online survey tool. But a very nice soul at May First/People Link, where we house the NOSI site, set up LimeSurvey for us, so we're using that. LimeSurvey is actually quite powerful. Like many free and open source software tools, it leans toward the powerful, flexible side, rather than the slick, easy-to-use side. So it has its challenges. The admin interface is nice looking, and fairly intuitive. The surveys could stand some graphic design help, but you can design templates for it. It's a LAMP stack application. Worth a look if your organization does surveys.


It\'s my social graph, darn it!

On 04 Jan, 2008 By mpm With 3 Comments

Some interesting things are happening in Web 2.0 land. There has been quite the dustup, started by Facebook kicking Robert Scoble off because he'd violated the Facebook terms of service. As a result, Scoble joined the group dataportability.org, which I've been monitoring for a few months now. Why did Scoble get booted (he has since been reinstated)? Because of a script, Plaxo Pulse, that scraped names and email addresses from Facebook. I think people are finally realizing that the current state of affairs - where we can pump data into Facebook and other social networks, but not get data out of them - is untenable. There's a poll on mashable.com, where the sentiment is most certainly heavily in favor of Facebook opening up the social graph. So after my brief lapse, I'm going back to my promise: no more social networks until the data flows both ways, and I can take my social graph with me.


Free and open source tool #1: Thunderbird

On 03 Jan, 2008 By mpm With 1 Comments

Before the holidays, I promised that I'd do 100 posts this year on free and open source tools. So, I'm starting with Mozilla Thunderbird. I use it every day, nay, almost every waking minute, since email is such a critical beast. Generally, Thunderbird falls into the category of free and open source software that "just works." It's easy to set up accounts, move mail around, do sophisticated filtering of mail, and such. And, because it's in the Mozilla family, it has a plug-in architecture which can add some really neat features. I'm using one that allows me to see a calendar (I use it to view my Google calendar) - it's a good quick way within Thunderbird to see if I'm free on a certain day. Thunderbird is cross-platform, too, so if you're like me, and hop between platforms, Thunderbird is there with you. And it's secure, and makes doing GPG signatures and encryption easy (although I haven't gotten around to doing them. Shame on me.) Thunderbird is in the process of being spun out of the Mozilla Foundation into its own organization, dubbed "MailCo." I don't know if that name will stick. But I think that Thunderbird has suffered from the Mozilla Foundation's focus on Firefox, and some good solid focus on it as a product is welcome news to me, as a daily user.


Last minute tidbits

On 20 Dec, 2007 By mpm With 1 Comments

This will be my last post of 2007 - I'm taking some days off from work and blogging, and won't return until the beginning of the year. First, links for the day:

  • High Tech Trash - an in-depth photo essay and interactive feature on the National Geographic website. It's sobering - as much as I love technology, it scares me how much damage it can do to both people and the environment, once we're done with it and ready to upgrade to something new.
  • Many nonprofits have Linux file servers in their back offices. In a huge agreement, Microsoft agreed to share information about Windows with the Samba project, so that it can keep up to date easily. This was to appease the European Commission. This is great news.
  • OpenOffice.org is coming closer to doing PDF import. Happy dance, anyone?

I've got a number of ideas up my sleeve for next year for this blog, one of which is to take up the challenge that Beth mentioned, and do 100 posts on something. So next year, I'll be doing 100 posts on particular free and open source tools. And now, the top ten posts of the year, according to my Google Analytics stats:

  1. Getting Naked: Being Human and Transparent. Hmmm, think it was that keyword? However, the bounce rate on that one was well below average, so maybe not.
  2. How do we make change if we keep doing things the same way
  3. Platforms Break Open
  4. The Search for Good Web Conferencing
  5. Open Source Database Solutions, Part I
  6. Carnival of Nonprofit Consultants (November)
  7. Linux, Ubuntu, Feisty Fawn and Me
  8. Carnival of Nonprofit Consultants (May)
  9. Time to find a fundraising solution that can't be bought
  10. Spirituality and Technology

And, just to be fair, the least favorite post is What do you expect from a technology provider? Two whole pageviews. I find that fascinating. I wish you all a holiday season full of fun, quality time with family (chosen or otherwise), and joy.


The power of open source VOIP

On 18 Dec, 2007 By mpm With 1 Comments

Today seems to be Asterisk day. What is Asterisk, you ask? Asterisk is the open source PBX application that works using VOIP. It rocks. I wrote a case study about it in the NOSI primer - it allows for really great flexibility in building phone systems. And today, I learned about two online tools written with Asterisk, which would have been impossible a few years ago. Committee Caller seems like an amazing tool. You choose the House or Senate committee you want to call, type in your phone number, and Committee Caller will sequentially dial each member of the committee so that you can leave your comment. I haven't tried it yet, but I will. Rondee is a new free conference calling utility, also built on Asterisk, which has a much nicer and easier-to-use scheduling interface than Freeconference.com, and some very cool features - like if you register your phone number, you never need to enter a PIN, because the system is smart enough to know what conference call you're supposed to be on - you'll just get joined to it. It seems cool, and a great alternative to Freeconference.com. And it's free, too. Asterisk made it possible for the company to provide this service without huge infrastructure costs. I look forward to seeing more of the new and interesting tools that can be powered by Asterisk under the hood. Oh, and did I mention - it works really well as a generic PBX, something lots of nonprofits need.
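To give a flavor of that flexibility: an Asterisk dialplan is just a text file. The fragment below is a sketch only - the context name, extension numbers, and device name are invented for illustration, and a real system also needs the matching SIP/channel configuration:

```ini
; extensions.conf (hypothetical office context)
[office]
; Ring a staff member's SIP phone for 20 seconds, then send to voicemail
exten => 101,1,Dial(SIP/alice,20)
exten => 101,n,Voicemail(101@default,u)
exten => 101,n,Hangup()

; A bare-bones conference room -- the kind of building block that services
; like Rondee wrap with scheduling, PIN-less joining, and the rest
exten => 600,1,Answer()
exten => 600,n,MeetMe(600)
```

A few lines like these replace what used to require proprietary PBX hardware, which is exactly why tools like Committee Caller and Rondee became feasible.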


Web 2.0 Experiments, snafus and stumbles

On 17 Dec, 2007 By mpm With 7 Comments

I seem to have lost my head. Really. I was all curmudgeonly until last week, when I started tweeting and got into Spock. You know why I started to twitter. Ages and ages ago, when Spock was still in private beta, I got an invite, and used it. I was underwhelmed, and forgot about it. Then, last week, I got a request from Beth Kanter and Deborah Finn to join their "trust networks." Well, I already trust them, so I joined. I then decided, why not - let's find out who else is on Spock. So I did the usual, and gave up my Gmail password. Turns out, unlike Facebook or MySpace and such, the "Spock Bot" makes pages for people without their knowing. So people who were in my Gmail address book, and on Spock, got a request for trust from me, not knowing where it came from. So, although I can trust Beth and Deborah, it appears I can't trust Spock. There have been lots of blog posts about Spock, mostly negative. I'm hoping that Spock ends up in the deadpool, but who knows. Then, for the creepy part. I joined Spokeo. Spokeo takes your Gmail, AOL, or Yahoo address book and, looking at a wide variety of Web 2.0 communities, from LinkedIn to Flickr to ... Amazon.com, keeps track of your contacts' content. So when someone in your address book posts a new photo to Picasa, or tweets, you'll know about it. Creepy part: do I really want to know what's on my ex-girlfriend's MySpace page? Or that a certain nonprofit Executive Director Dugg a post about starting a video game company? (Although I do have to admit it's fun to know what a very old friend is listening to on Pandora.) What have I learned in all of this? What my colleagues and friends do has influence. I did set a pretty high bar a while back for the next social network I'd join. And what did I do with the influence of colleagues and friends? Walked right under it.
This is not at all to blame them, it's just to state a reality - what other people (those I trust and follow) do matters, and I think it matters for most people. What else have I learned? Privacy matters. I happen to be someone who has had a relatively high online presence since before the web (remember Usenet?) I'm someone who has, since day one, tried my damnedest (and succeeded 96% of the time) to only say by email, or put up, what I would say in a room full of people. But for a long while, it took a lot to gather all of that information. No longer. The tools are getting better and better, and one of the hallmarks of Web 2.0 - the APIs - makes it all the more simple to aggregate all of someone's online content. I think I'm going to wait at least a few weeks after getting an invite to the next Web 2.0 tool before I jump in. Or perhaps I won't even do that. What a concept. Maybe it's time to go back to being a curmudgeon.


What was it, the question mark?

On 14 Dec, 2007 By mpm With 4 Comments

I feel misunderstood. Earlier this week, I wrote a post about the NTEN CRM satisfaction survey. I've now seen two posts (one from David Geilhufe, and one from Lobo of CiviCRM) suggesting that I dismissed the extremely positive results for the open source CRM tools (particularly CiviCRM) because the sample wasn't large, or representative of the sector. The whole point of the post was to crow about how positive the open source results were. But if I crowed about those results without making sure that people understood that the sample was small, and not representative (which is impossible to argue against), I would be irresponsible. No concrete conclusions can be drawn about overall use of or satisfaction with CRM tools from this survey. It wasn't scientific, and the sample was about .05% of the nonprofits in the United States (the Foundation Center says that there are about 1.4 million of them.) How could that be representative? NTEN did a great job of beginning to approach this topic, and it was great data. To my mind, it bodes well for the open source tools. That was my point.
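The ".05%" figure checks out from the numbers quoted in these posts (665 survey respondents against the Foundation Center's estimate of 1.4 million US nonprofits):

```python
respondents = 665            # total respondents in the NTEN CRM survey
us_nonprofits = 1_400_000    # Foundation Center estimate cited above

# Share of the sector represented by the sample, as a percentage.
share = respondents / us_nonprofits * 100
print(f"{share:.3f}%")       # roughly 0.05% of US nonprofits
```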


Movable Type goes Open Source

On 13 Dec, 2007 By mpm

This is old news, sort of. A ways back, Six Apart promised that it would open source Movable Type, their flagship software product, and the software that underlies their TypePad service. Yesterday, they finally released it. This blog (and my personal blog) were on TypePad for years, and I rather like the Movable Type interface and feature set. Their new version, MT4, looks pretty good, and it's a great thing that it's now open source (released under the GPL v2, interestingly enough.) I'm liking Six Apart more and more these days. They are really putting their money where their mouth is, in terms of working toward more openness. They've been supporting open standards for years. Had this happened 6 months ago, when I was ready to migrate my blogs, I would have just migrated them to MT4, instead of WordPress. But, that said, I like WordPress, too. I imagine that this is a bit of a response to WordPress (others think so too.) It will be interesting to see how this all plays out - both are incredibly strong applications. Movable Type is written in Perl, which I hadn't realized until I was doing research for this blog entry. But in any case, it should work on all generic hosting environments, and it looks easy to install. So here's another good option for organizations that want powerful blogging software to use on a generic (read: cheap) hosting environment. Hurray!


Open Source CRMs - people like them?

On 12 Dec, 2007 By mpm With 3 Comments

I had a good look at NTEN's CRM Satisfaction Survey (yippee for data!), and although the sample sizes were small, and not representative of the nonprofit sector as a whole, the people surveyed seemed to like the open source tools available. There were 6 open source (or sort of open source) tools that showed up on this survey. They included CiviCRM, SugarCRM, and vTiger (which is actually a modification of SugarCRM), all with vibrant developer and user ecosystems. The three others: Democracy in Action is a SaaS that is open source; CitySoft says it's open source, but I don't know whether it is through an OSI-approved license, since they don't say (taken at face value, CitySoft doesn't violate the letter of the law, since you can get the source code if you buy their product, but the source code is unavailable otherwise, which sort of violates the spirit of open source); and Organizer's Database is open source, but written on top of a proprietary platform (Microsoft Access). 201 out of 665 users used these 6 open source tools. I don't think that's possibly representative of the sector (especially since in the survey, the most popular CRM was CiviCRM.) That said, for the most part, except for CitySoft and vTiger, people seemed very satisfied with these tools. CiviCRM was first in satisfaction; SugarCRM, Organizer's Database, and Democracy in Action were 3rd, 4th, and 5th, respectively. That's pretty impressive. Among those surveyed, 4 of the top 5 tools in terms of satisfaction were open source (or sort of open source) tools. The only other tool in the top 5 was Salesforce. Satisfaction with Convio, Kintera and Blackbaud all trailed these top tools. We really can't draw any conclusions from this - the sample size was small, and, as I mentioned, not representative of the sector. But it's a very good sign that people seem satisfied with the open source tools available for one of the core functions of nonprofit organizations.


LinkedIn suits up

On 10 Dec, 2007 By mpm

LinkedIn, the serious MBA-wielding brother to the Facebook fratboy and the MySpace rockergrrl, is really putting on the suit now. They've added some new features, like a new personal homepage with things such as "Company Updates" - news about your company - and other business-friendly features. Also, they have a partnership with BusinessWeek, so you can click through to see how you are connected to the companies and individuals covered. It all sounds like LinkedIn wants to pull all of those people who have been migrating to Facebook back into their fold, with the idea that LinkedIn is serious about business. It's an interesting strategy. So, how is this relevant to nonprofits? I expect that this will enhance the appeal of LinkedIn for nonprofit executives, staff, and consultants for our own networking needs. I think in some ways, this might decrease LinkedIn's usefulness as a platform for fundraising or constituent-building by nonprofits (it has always seemed less viable for this than either Facebook or MySpace.) Hat tip to Marshall Kirkpatrick, who tweeted about his Read/Write Web post (wow, Twitter has already come in handy.)


Why I\'m twittering

On 07 Dec, 2007 By mpm With 1 Comments

Yes, it's hard to believe. I succumbed. I have said many times that I wouldn't Twitter. I've critiqued Twitter and social networks in general. So what's the story? Michelle, twittering? There are a number of factors at work. There finally seemed (to me) to be enough interesting people and things to follow on Twitter - it began to seem like microblogging was more than just about what kind of tuna sandwich someone was eating (although it sometimes still is that.) There are substantive conversations that happen, and real information gets shared. Since I was already a Facebook status addict, once I learned that I could basically make Twitter my one-stop status shop (put it in my blog, on Facebook, wherever I wanted), that made sense to me. But the biggest reason that I'm twittering is that as the Coordinator of NOSI, I'm experimenting with Web 2.0 in general - which includes Twitter, Facebook, and others - with the goal of crafting a strategy. I think given the audience that NOSI is reaching, and wants to reach, a concrete, well-thought-out Web 2.0 strategy that includes a whole host of tools, including Twitter, is a darned good idea. So, if you want to follow me on Twitter, please do.


Open content business models

On 30 Nov, 2007 By mpm

I'm at the Open Translation event, and we've just had a great session on open content business models. It was very useful and interesting, and gave me lots of food for thought. I've been interested in the question of how we sustain open content for a long while. I was the note-taker for the session, and I feel like there are a lot of great ideas out there. In general, it seems like most models depend on some sort of up-front funding, whether it be an investment or a grant, to fund the initial writing of a large amount of content. The problem of how you fund the actual writing of content was not really addressed, and I think that is one of the harder nuts to crack. One interesting model was asking for pledges: if the pledges reach a certain amount, the content gets produced. But ongoing sustainability of already-written open content seems to have been at least conditionally solved by a variety of folks in a variety of ways:

  • Training and consulting based on existing content
  • Generating revenue by doing print on demand, with a markup
  • Production of corollary items such as t-shirts
  • Hybrid model - most content is free, some content is closed, and paid for
  • Advertising on a site with open content
  • Corporate sponsorship
  • "Robin Hood" models: asking larger Northern organizations to subsidize the distribution of content for the developing world

This is very interesting fodder for my thinking about the puzzle of how to make NOSI a strong, sustainable organization. The thing we have actually done the most of is write the primer, and I've got more ideas for types of open content that NOSI could get involved in producing, so these suggestions for business models are quite welcome.


What I\'m learning

On 30 Nov, 2007 By mpm With 1 Comments

It's been mostly fun so far at the Open Translation event here in Zagreb. I'll leave the complaining about Croatian food and other things to my personal blog, when I get the time. The event itself has been fab. As one of those monolingual American types, I'm learning a huge amount about what it takes to create open content in different languages. It is actually pretty mind-boggling. There are issues that relate to encoding, fonts, and character sets; machine translation; interfaces to facilitate human translation; workflow; volunteer and project management; and a whole host of other things. It's also really interesting to see how free and open source software fits into all of this. What are the tools like? How do we replace proprietary tools? How does this all get paid for? My role has been to gather up the use cases (specific examples of translation processes). That's been a very interesting process, and we have been generating some good examples that will be really helpful in figuring out what tools exist that can do what's needed, and what gaps remain. Check out the wiki. Lots of food for thought for NOSI and the future.


On my way to Zagreb!

On 26 Nov, 2007 By mpm

I'm going to Zagreb, Croatia to be at Aspiration's Open Translation event. I'm really looking forward to it. It will be my first international open source event, and it's an amazingly interesting topic. So I am so excited to be going. I have been tasked to be "use case librarian," which is very cool, since I am a real fan of use cases. I'll be posting pics to Flickr, for sure!


Wiki Syntax madness

On 26 Nov, 2007 By mpm With 2 Comments

Like most people deeply embedded in Web 2.0, I am an avid wiki user. I have become a complete devotee of DokuWiki, which I use locally on my laptop for my to-do lists, notes, etc. I love it because it's really easy to set up and back up (it's all files, not a database), and it has draft autosave (yay!). I have two other wikis (one public and one private) in MediaWiki, on my web host. And I contribute to varied other wikis, which are on varied other wiki platforms. And none of these have the same syntax - they are similar, but slightly different. Different enough to drive me crazy. A while ago, when I was still developing web applications, I wrote a wiki plug-in for a behemothic open source CMS/web database system that I wrote, which has (mercifully) died a slow death (there are still a few installations of it in use, hopefully soon to be retired.) I didn't get very far into coding the markup, but I had decided that I'd follow MediaWiki's syntax, since it was the most popular wiki software. I just wish that the gazillion wikis out there could somehow agree on a common syntax. I doubt they will, but it would be nice.


Why I won\'t be buying a Kindle

On 26 Nov, 2007 By mpm With 3 Comments

I think I might need a new blog category: why I won't be buying ... First Leopard, now Kindle. Kindle, at first blush, sounds pretty cool. I've been waiting for devices using e-ink technology for a while now. And I'm an avid reader, so the idea of being able to carry a bunch of books with me in a small package (instead of the usual very heavy pile I travel with) is quite appealing. And $10 a book is great - and I love the idea of not using all of that paper. But ... why am I not buying a Kindle?

  • No wifi - it uses Sprint's EVDO network (for me, that makes the Kindle basically a brick when I'm at home.)
  • Closed - you can't upload open document formats
  • You can only buy books from Amazon
  • I'm still not clear about what happens when you buy your 201st book - do you have to throw one out?
  • You can't share books - I like to loan my books out to people.

I want something like a Kindle that:

  • Has wifi (EVDO would be a nice addition, but some of us live in areas where the EVDO network does not go)
  • Allows me to share books with people.
  • Allows me to upload any open document format (.odt, rich text, pdf)
  • Allows any vendor of books (like Lulu, for instance) or any independent author to provide books for the device.

Sorry, Amazon. Kindle is a bust for me.


Open source your Open Social Apps?

On 21 Nov, 2007 By mpm With 2 Comments

Beth's wonderful post about a decision tree for whether or not an organization should get into the social networking business had a link to a comment about OpenSocial. The salient quote:

Why not roll your own social network, include the OpenSocial API, and have applications, groups, widgets and portals to your site in any number of the "OpenSocial" platforms? Whether an existing member of your organization chooses to participate in any social network or not should not affect your decision to have a presence (group and/or application) in the social-networking space.

Which led me to think about the idea of open sourcing OpenSocial apps. It seems to me that many organizations are going to have very similar needs in terms of the kinds of applications they want. Can we build a library of OpenSocial applications that have open source licenses? Anyone interested? Maybe this is the use for opensocialorg.org! :-)


Carnival of Nonprofit Consultants

On 19 Nov, 2007 By mpm With 5 Comments

Today, it's my turn to host the Carnival of Nonprofit Consultants. It was an open call, so there are a wide variety of posts to talk about.

Keep track of the Carnival of Nonprofit Consultants, no matter which blog is hosting, by subscribing to the Carnival feed.


Linux Desktop Migration

On 16 Nov, 2007 By mpm

Linux has proven itself as a server platform - no one really questions it. A large chunk (the majority?) of nonprofits already use Linux server-side - either in-house or, if not, their web host usually does. But can it really be a desktop platform for nonprofit organizations? Linux on the desktop has come quite far in just a few years. And recently, there is increasing evidence to suggest that it can, indeed, in large part replace Windows on the desktop. Why should it? Linux is more secure, more stable, and can be used on older hardware. Walmart was selling $200 PCs running gOS (no, that doesn't stand for googleOS, but greenOS; it's based on Ubuntu 7.10), and they sold out. If you read the reviews (most of which were quite positive), the people who liked it were looking at the real functionality (it could edit their documents, surf the web, read email, etc.), and those who didn't seemed not to like it mostly because it doesn't run Windows (although one could install Windows on it - but it's going to be pokey; it's not a well-powered machine, though more than enough for Linux.) So, if students and Grandma can use Linux, can nonprofits? There is a good whitepaper released this fall by Novell, which has a section on what to think about with an enterprise migration to Linux on the desktop. It basically echoes what I would suggest when thinking about a mass migration:

  • Planning is key.
  • Do a software inventory - figure out:
    • what has a version that runs on Linux
    • what can be replaced by software that runs on Linux
    • what can be run in an emulator or virtual machine such as VMware
  • Identify types of users (by what they need to do).
  • Choose a distribution that makes sense (I wrote up a review of Linux distros recently).
  • Work out a clear migration strategy that takes all of this into consideration.
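The inventory step lends itself to a simple script. Here is a hypothetical sketch - the application names and Linux mappings below are invented examples, not a real inventory:

```python
# Hypothetical sketch: bucket a software inventory into migration categories.
# The app names and mappings are illustrative examples only.

NATIVE = {"Firefox", "Thunderbird", "OpenOffice.org"}      # Linux versions exist
REPLACEABLE = {"MS Office": "OpenOffice.org",              # swap for a Linux app
               "Outlook": "Evolution"}
EMULATE = {"QuickBooks"}                                   # keep in a VM (e.g. VMware)

def plan_migration(inventory):
    """Map each app in the inventory to a migration strategy."""
    plan = {}
    for app in inventory:
        if app in NATIVE:
            plan[app] = "native"
        elif app in REPLACEABLE:
            plan[app] = f"replace with {REPLACEABLE[app]}"
        elif app in EMULATE:
            plan[app] = "run in a virtual machine"
        else:
            plan[app] = "needs research"
    return plan

print(plan_migration(["Firefox", "MS Office", "QuickBooks", "CustomDB"]))
```

The "needs research" bucket is the one that usually decides whether a migration is feasible.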


Facebook Ad Platform

On 13 Nov, 2007 By mpm With 3 Comments

It always takes me a bit to digest new Web 2.0 news, so I'm just now blogging about last week's news that Facebook launched a new ad platform. The platform contains two parts, which I'll talk about individually. It's an interesting time, and there is a good question: is this something nonprofits should jump on?

The first part of the package is Facebook Fan Pages. A company (or organization, or individual) can set up a public page (so it does not require a Facebook account to see it). Individuals can become "fans" of that product or organization. That shows up in users' news feeds and in their profiles, and, of course, can spread virally: if a friend of mine becomes a fan of an organization or company I happen to like, I might become a fan as well. One early question was SEO: fan pages are public and have their own URLs, and Facebook pages have huge PageRank, so that must be great for search ranking, right? Well, no link love from Facebook - apparently, those links carry the dreaded rel="nofollow" attribute. But is it still worth it for nonprofits to have a fan page? Many already have Causes set up. Jeremiah Owyang, whose blog I read occasionally, has a good point: figure out what your strategy is first!

The second part of Facebook's advertising platform is likely not going to be used by most nonprofits, but it is important to consider if Facebook is part of your constituent-building or fundraising strategy. Social Ads are sponsored advertisements linked to users' profile data, social graph, and activities. Ads can be targeted by profile data. Also, if a friend agrees, their activities around a particular product (say, a movie rental) will show up in news feeds. Of course, there are huge privacy issues here, and one law professor thinks the ads are illegal.

What does this mean for nonprofits? Well, it depends. Will this advertising platform alienate users? And, more importantly, will nonprofit messages get lost in the stream of news feed posts about Joe being a fan of Apple, and Jane renting a four-star movie from Blockbuster?
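The rel="nofollow" point is easy to check mechanically on any page. A minimal sketch using only Python's standard library - the sample HTML here is made up, not taken from Facebook:

```python
from html.parser import HTMLParser

class NofollowChecker(HTMLParser):
    """Collect outbound links and whether each carries rel="nofollow"."""
    def __init__(self):
        super().__init__()
        self.links = []  # list of (href, is_nofollow) pairs

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            attrs = dict(attrs)
            rel = (attrs.get("rel") or "").lower()
            self.links.append((attrs.get("href"), "nofollow" in rel.split()))

html = '<a href="http://example.org" rel="nofollow">My Org</a>'
checker = NofollowChecker()
checker.feed(html)
print(checker.links)  # [('http://example.org', True)]
```

A link flagged `True` passes no "link love" to search engines, which is exactly the complaint about fan page links.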


Open Social != Open Data

On 08 Nov, 2007 By mpm

As the hype (which, I agree, I have contributed to) around OpenSocial dies down, the reality behind OpenSocial becomes clear. Tim O'Reilly has a bang-on post about the fact that OpenSocial does not mean that users can have data portability. Apparently, the data stays in the container (the social network site) and probably can't move beyond it. Tim says:

If all OpenSocial does is allow developers to port their applications more easily from one social network to another, that's a big win for the developer, as they get to shop their application to users of every participating social network. But it provides little incremental value to the user, the real target. We don't want to have the same application on multiple social networks. We want applications that can use data from multiple social networks.

TechCrunch suggests the issue is in the business model:

Unfortunately, the business models have not been worked out yet to accommodate such mixing of data. If a social mashup starts making money from ads, how would that be split up between the host site, the app developer, and all the other applications or social networks from which that mashup pulls data? O'Reilly doesn't really have an answer for that one.

I don't really have an answer to that one either, but for our sector, that's really where the power is going to lie. Sure, some very savvy organizations will do well if they have to develop only one or two apps for social networks. But it's the remixing of data from many networks that provides the real win for users and nonprofits.


The evolution of web hosting

On 08 Nov, 2007 By mpm With 2 Comments

It seems like not so long ago that I helped an organization build a Linux email and web server, which we plugged into a college internet connection so that they could begin to take advantage of the wonders of the internet. It was, at the time, the only affordable way to do it - there was no broadband, and a T1 was far outside the realm of affordability for nonprofits. I even remember writing a grant to some federal agency that probably no longer exists, to help create a local infrastructure to get nonprofits online. When was that? 1995.

It wasn't long after that that virtual hosting companies became ubiquitous and affordable for nonprofits. But it's only been in the last few years that mega storage and mega processing power have become available to organizations to power big web applications and the like. Amazon seems to be leading the next wave in the evolution of hosting: pay only for what you need, when you need it. They started out with S3, their Simple Storage Service. And now there is EC2, the Elastic Compute Cloud - pay only for the storage, processing power, and bandwidth you actually use. I did a quick calculation of what my own usage might be, and actually, my Dreamhost account is a better deal. But for much larger or high-traffic sites, or sites whose traffic fluctuates a lot, it might be a great idea, especially if you want dedicated hosting.

The news today, and why I'm bothering to talk about this, is that Red Hat announced that it will offer RHEL - their enterprise distribution - on Amazon EC2. If a nonprofit organization has a server, it's actually not so unlikely that it is running RHEL. A lot of organizations of all types want support, and are willing to pay for it, and Red Hat has, at this point, built a better business model around this than any other distro (Canonical, with Ubuntu, is sneaking up behind, but I'm not sure it has the "enterprise" style some people look for). So running RHEL on Amazon EC2 is a potentially low-cost, low-pain way for nonprofits (with appropriate levels of tech staff, of course) to dip their toes into hosting complex applications on Linux, without having that noisy box in the corner.
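A pay-as-you-go estimate like the "quick calculation" above is simple arithmetic. A sketch - the rates below are illustrative assumptions for the example, not Amazon's actual (or guaranteed historical) prices:

```python
# Back-of-the-envelope hosting cost sketch. The rates are illustrative
# assumptions, not real Amazon pricing.
EC2_INSTANCE_PER_HOUR = 0.10    # small instance, USD
S3_STORAGE_PER_GB_MONTH = 0.15  # USD per GB-month stored
TRANSFER_PER_GB = 0.18          # USD per GB transferred

def monthly_cost(hours_on, storage_gb, transfer_gb):
    """Estimate one month of pay-as-you-go hosting."""
    return (hours_on * EC2_INSTANCE_PER_HOUR
            + storage_gb * S3_STORAGE_PER_GB_MONTH
            + transfer_gb * TRANSFER_PER_GB)

# An always-on small server: 720 hours, 5 GB stored, 20 GB transferred.
print(round(monthly_cost(720, 5, 20), 2))  # 76.35
```

The instance-hours term dominates for an always-on box, which is why a flat-rate shared host can beat the cloud for a small, steady site.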


Online Courses

On 06 Nov, 2007 By mpm

I've been thinking a lot about giving online courses in the use of open source software. That was one of the big reasons I had been looking for good web conferencing a while back (I'm still looking...). I'm starting my foray into this territory by giving a free course on the database management system PostgreSQL. It will be given under the auspices of Linuxchix, an organization I've been a part of since 2000. The course is a 12-week course, starting on November 19th (with time off for the holidaze). Check out our Moodle page for more details (the outline and information on how to participate are there). I'll be following this up with an Open Source 101, starting in early 2008, on my own Moodle site.


Open Social Networks

On 05 Nov, 2007 By mpm With 2 Comments

As I've mentioned before, there has been a lot of thought and interest in the issue of opening up social networks, which have been, until last week, totally closed silos. I linked a while back to a great thought piece on opening up the social graph (that is, your network of friends). Jon Stahl pointed me to another great article, also about opening up the social graph. There is a Google group (called Social Network Portability) that you should definitely join if you are at all interested in these issues. Of course, OpenSocial has blown the doors off of all of this, and what's come out of it is quite interesting. Folks on that list are beginning to talk about how to implement portability. An amazing number of new sites launched over the weekend (I imagine people sitting in their home offices with lots of caffeine and pizza). Here are some I've found so far, that I'll be following:

  • OpenSocial Zen - meant to be a place for developers to share ideas. They haven't really started yet, but hopefully it will be an interesting place to watch
  • OpenSocial Directory - a directory of the apps that already exist to use OpenSocial (talk about caffeine and pizza!)
  • OpenSocializr - a Ning social network on OpenSocial (I guess that's logical)
  • OpenSocialBlog - an interesting blog about OpenSocial

Why do I have the feeling that every domain containing "opensocial" is already taken (opensocialblog.com, opensocialcats.com ...)?

So why is this important for nonprofits? First off, it means that in the short term it will be possible to write just one application and reach multiple social networks, thus expanding reach. In the long term, if the whole Web 2.0 world becomes, instead of a bunch of walled gardens where data moves between them only in very limited ways, a fully permeable space where data flows freely, it will be possible for nonprofits to have much greater reach and impact, whether for fundraising, advocacy, or constituency-building.



More good news from Google: Open Handset Alliance

On 05 Nov, 2007 By mpm

This isn't actually nptech news, per se, but it's good news for nonprofits: Google, along with partners such as T-Mobile, Qualcomm, and others, has created an alliance called the Open Handset Alliance, and a phone operating system called Android, which will put open source software on mobile phones. This is big. It means that anyone can hack their phone, and a raft of developers can create really interesting kinds of software for phones. The SDK will be available later this month. Of course, the bottom line is that this makes it more likely that Google can get its ad platform onto phones. But they seem to realize that the key to their success is being open, and they are doing their best to move openness into as many places as possible. And just like OpenSocial was a great answer to Facebook, this is a great answer to the iPhone. Why is this good news for nonprofits? Katrin over at MobileActive.org weighs in, and I agree:

So what does this mean for the 'mobile for good' field? We hope that this will spur development for more social applications and mashups as well as better distribution of these applications worldwide. For example, HiV Aids rapid information and testing services built on mobiles, climate and poluution monitoring applications, mobile information services that provide consumers with point-of-purchase environmental or other information services about products, mobile human rights monitoring applications, mobile social and organizing networks for trafficking or domestic abuse victims - the list of potential applications is as endless and varied as there are civil society causes.

I'll be watching the Open Handset Alliance, and wondering when I can replace my Blackberry with an open phone.


Why I won't be buying Leopard

On 01 Nov, 2007 By mpm With 3 Comments

As many of you know, Apple's newest version of the Mac OS, 10.5, shipped just a few days ago. I have been an Apple user since 1980, and a Macintosh owner since 1987. I have owned about a dozen Macintosh computers (or clones) over the course of 20 years. I still own a Mac mini, which I expect will be my last Macintosh, and I won't be upgrading it to Leopard.

Those of you who are loyal Mac users are gasping. I'd have gasped if I'd read this a couple of years ago. The Macintosh operating system has, without question, the best, most intuitive user interface ever invented, built on top of the best OS invented, UNIX. Things "just work" (for the most part - apparently Leopard has been having issues). And I've been quite happy that the Mac OS is gaining market share over Windows - it would be great to see that continue. There are a number of reasons for my decision to slowly leave the Macintosh platform:

  • I want to focus more energy and time on free and open source platforms - I might donate what I would have spent on Leopard to some deserving projects.
  • I'm not liking Apple's increasingly closed and monopolistic tendencies when it comes to the iPod and iPhone.
  • I don't use my Mac much anymore - I migrated to Linux as my main desktop, and will be sticking there. I do have a few things left to migrate, including time tracking (I'm starting to use GnoTime), PIM data (I haven't decided which route I'll take, but I'm definitely migrating that data to Linux this month), and music (which will be hard - I have quite a number of DRMed iTunes Music Store albums I will have to painstakingly convert). The only thing that will be left is games.
  • I like building my own systems - I need a new desktop, and I like the idea that I can build one easily, and get a fair bit of power fairly cheaply.

It's been fun, these 20 years with Macintosh. It seems a fitting moment to say goodbye to Apple.


What OpenSocial Means

On 01 Nov, 2007 By mpm With 3 Comments

The buzz of the blogosphere is the announcement of Google's OpenSocial. I thought it would be a good idea to describe what it is, and what it might mean for the nonprofit sector. Marc Andreessen, who is, of late, connected to Ning, has a great blog entry with details.

OpenSocial is a set of APIs, aimed primarily at developers. Google has a number of partners, including social network sites like LinkedIn, Friendster, and Ning, as well as Salesforce - which has very interesting implications, given the increasing use of Salesforce in the nonprofit sector. The APIs handle three different kinds of user data: profiles, the social graph (who your friends are), and activities (the stuff of Facebook news feeds). And the language of these APIs is standard HTML and JavaScript. Any application written for OpenSocial will work on any partner social network - any OpenSocial "container". That means developers need only write an app once, and it can be used on any of the networks involved, like Orkut and LinkedIn. Basically, the more social network sites that adopt OpenSocial, the more open the whole thing gets.

One of the big issues with social network platforms was that once Facebook made its platform available, and MySpace and LinkedIn followed, it looked like developers would have to port their apps to each social network. OpenSocial means, basically, that they can port to a whole lot fewer of them - hopefully, eventually, they can write their apps just once. Facebook has quite the motivation to keep people on Facebook, and keep the eyeballs there, because its revenue model is ad-based. This breaks the whole thing open. I'm not so clear about how this helps users. I expect that, because the APIs allow connections to users' profile, social graph, and activity data, portability and permeability between social networks are bound to happen. But the path to truly portable profile, social graph, and activity data (with adequate privacy controls) is still not entirely clear.

What does this mean for the nonprofit sector? Allan, in his inimitable style, talks about how most nonprofit organizations will not be able to take advantage of OpenSocial. No question about that. Most nonprofits haven't even begun to take advantage of the Web 2.0 world in general, let alone the bleeding edge of OpenSocial. And I'm not entirely clear yet how many should be jumping on this bandwagon for either fundraising or community-building. Friendster, Orkut, Hi5, and LinkedIn have very different demographic and geographic reaches. Ning, which is the social network of social networks, could end up being a very important player here. I think the inclusion of Salesforce in the mix will be very interesting for web-savvy nonprofits who are thinking about, or have started, writing apps for social networks. It will be very interesting to see how this plays out - what kinds of integration will be possible between social network data and CRM data? Anyway, OpenSocial is something I'll be watching, playing with, and writing about as time goes on, and considering what it means for those of us in this sector.

Update: MySpace, Six Apart (LiveJournal, TypePad, and the newish social networking blog platform Vox), and Bebo have now all joined OpenSocial. This is getting really interesting!
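The three kinds of user data the OpenSocial APIs cover - profiles, the social graph, and activities - can be pictured as plain data structures. A minimal sketch; the field names here are invented for illustration, not taken from the actual OpenSocial API:

```python
from dataclasses import dataclass, field

# Illustrative shapes for the three kinds of user data OpenSocial exposes.
# All field names are hypothetical, for the sketch only.

@dataclass
class Profile:
    user_id: str
    display_name: str

@dataclass
class Activity:                  # the stuff of news feeds
    user_id: str
    title: str

@dataclass
class SocialGraph:               # who your friends are
    user_id: str
    friend_ids: list = field(default_factory=list)

    def mutual_friends(self, other):
        """Friends two users have in common - the kind of query an app might run."""
        return set(self.friend_ids) & set(other.friend_ids)

a = SocialGraph("alice", ["bob", "carol"])
b = SocialGraph("dave", ["carol", "erin"])
print(a.mutual_friends(b))  # {'carol'}
```

The portability question is exactly whether structures like these can move between containers, or stay locked inside each one.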


Satellite Internet gets the boot

On 31 Oct, 2007 By mpm

Finally, I have real broadband. We moved last weekend to a town that has actual real cable high-speed internet. Unfortunately, it is the apparently increasingly dastardly Comcast, but it is so much better than satellite from Hughesnet that I can't really find it in myself to complain. Traffic is flying at four-plus times the speed, and I can actually do remote shell sessions that aren't painful. I can Skype again (once I figure out how to use my USB headset on Linux). I have to admit to being a very happy camper.



Making a better, more findable blog

On 24 Oct, 2007 By mpm With 3 Comments

I've gotten some nice kudos for my blog in the past few days, and it feels nice to know that people read the blog and get something out of it. I want to make the blog better, and also more findable. I'm not going to embark on the 31 days to a better blog challenge - it's a bit too much for me to plunge into - but I will be doing bits and pieces of it over time, as well as delving deeply into search engine optimization (SEO), a topic I have watched only from a distance for far too long. I'll be blogging on that as I go through it, for sure. As a first step, I have a new poll. I hope you'll take it!


Book Reviews

On 21 Oct, 2007 By mpm With 3 Comments

I read three books recently that I thought would be worth reviewing here. They fall into that category of "business" books that I basically never read. I came upon these three for different reasons, and although I'm not interested in adopting their primary points of view, there were tidbits that made them worth the read (or a skim, in one case) in the end.

The first book is called "Made to Stick: Why Some Ideas Survive, and Others Die." I picked it up by happenstance, I think because I had NOSI on my mind, and I was thinking about how to talk to people about open source and nonprofits. It's an interesting book, with a basic premise: in order to get an idea to stick, the idea needs to be Simple, Unexpected, Concrete, Credible, tug at Emotions, and tell a Story (their acronym is SUCCES). They do a good job of using examples for each of these (like why Southwest Airlines is so successful, and why the Kidney Heist urban legend sticks so well, among other stories). It's worth a read, I think, if you have a message to get across.

The second book, called "Elements of Persuasion," was sent to me for free - someone thought I might want to review it on my blog. This was the book I skimmed because, honestly, I was bored after the first chapter. It focuses only on the last "S" of the first book: storytelling. It uses examples and such, but it is not anywhere near as engaging and readable as "Made to Stick." There are a few interesting and useful tips, but if you are only going to buy one book about getting your message across, buy the first one, not this one. In most cases, these books are designed for people who want to get more business (the first book not so much - it has some good nonprofit examples).

The third book is in a bit of a different category - not about landing more business, per se, but about making more money. It's called "Value Based Fees," and it's written by Alan Weiss, who has written the "Ultimate Consulting" series, which seems to be focused primarily on making a lot of money in consulting. I would never have bought this book if it hadn't come recommended by a colleague I respect. I mean, the cover has all these dollar bills on it! To explain a bit: he does big-money consulting with huge Fortune 500 companies, and does projects for hundreds of thousands of dollars that result in the companies saving, or making, millions. A very different context than the one I, or most people reading this, work in, for sure.

All of that said, he had some very interesting perspectives, one of which is something I would love to talk about with other consultants (and clients, too): he thinks that time-based billing is bad. His reasons are interesting. He feels that consultants should base their fees on the value they bring to the consulting relationship, not the time spent - there is an inherent conflict of interest in working for time, since it is in the consultant's interest to spend more time on the project, regardless of the outcome. He thinks deliverables are also problematic. Ultimately, he argues, all consulting relationships should boil down to the results for the client - things like saving millions by reducing employee turnover, or increasing profits by streamlining processes - not how many hours you spent at the client's office, or how many reports you wrote. Really, what he thinks is that these forms of billing reduce the fees you can charge. It's a little odd, because mainly what he's interested in is making more money. But some of his ideas are interesting, especially the notion of setting fees based on the value you bring, rather than the time you spend. I'm not sure how to make the translation to nonprofit consulting, but I do find it interesting how blanket his rejection of time-based fees is. And I do certainly see his point about conflict of interest: if we charge by the hour, we have an interest in spending more time.

My favorite consulting book is still "The Consultant's Calling," which, in some ways, is diametrically opposed to the values of this book. But there are some useful ideas to mull over.


Platforms break open, part II

On 18 Oct, 2007 By mpm

The dust is settling. I looked over Allan Benamer's post on the Convio and Kintera initiatives, looked harder at the Convio Open and Kintera Connect docs, and also had a chat with some Kintera folk. I have a few comments. Allan is right: the Kintera API is more comprehensive, and provides more flexibility, than the Convio API. Of course, the API was only one part of Convio's initiative, so I do still think they come out a bit ahead. But it may well be that for more complex integrations, the Kintera API will provide more power. REST vs. SOAP: Kintera seems to have chosen the "more power, harder to code" option. I could argue it either way. Methinks vendors in this space still just don't grok, really, what "open" means. While I appreciate that one can, theoretically (I have yet to test it), easily become a "partner" with either company, that doesn't quite count as open. Allan hit the nail on the head when he said:

Again, this is a lesson in Web 2.0 transparency both for the sector and the vendors who serve it. Control? Let it go. I really mean that. From both a business point of view and from the point of view of how our sector should work to heighten transparency in society at large, there's no reason to limit the ability of coders to learn about and discuss the API at hand. And the big guys have already done this work, check out the way Google and Amazon distribute their APIs. Those shine as industry-standard examples of how open APIs need to be distributed.

He's right. Open it up, let anyone bang on test data to try things out, and you never know what might happen. The drive toward open everything is pretty inexorable, and the pressure is only going to get greater.


Happy Birthday, Gutsy Gibbon!

On 18 Oct, 2007 By mpm With 1 Comment

Ubuntu Linux has a new release, version 7.10, called "Gutsy Gibbon." (Really, I don't know where these names come from!) There is a great review at Wired that gives a good overview of what you'll find. They say, among other things:

Gutsy Gibbon is certainly easier to install and set up than Windows Vista, and it's very close to matching Mac OS X when it comes to making things "just work" out of the box. Wi-Fi, printing, my digital camera and even my iPod all worked immediately after installation -- no drivers or other software required.

I'm in the middle of moving; otherwise, I'd be checking it out immediately. Once I get settled in a couple of weeks, I'll be giving it a spin, for sure. It seems that with Ubuntu, Linux is getting closer and closer to being a completely viable and usable desktop for everyone.


Platforms break open!

On 15 Oct, 2007 By mpm With 5 Comments

If you are new to this site, you might want to read more, and subscribe to my feed. One of the wonderful things that has happened since I wrote the Open API whitepaper way back in January is that vendors are finally realizing how important openness really is, and are beginning to implement things in a big way. Two new initiatives have come to light in the last couple of days: one from Kintera, called "Connect", and the other from Convio, called "Open". They are both worth a look, especially if you are considering implementing a web application platform, or if you are a consultant type looking for ways to integrate data for your clients. At first blush, although Kintera officially got out the door first - announcing Connect weeks ago, and delivering the APIs and docs on Friday - their play is a good start, but Convio, announcing Open tomorrow, appears to be ahead in terms of providing real openness. Here's a quick overview of both initiatives. You make your own conclusions.

Kintera's Connect has an API that can do some very important things. It allows you to access 16 entities within the Kintera application, including lots of data about contacts, plus data about appointments and tasks. The API is SOAP-based. One of my favorite quotes in the Connect documentation is this one: "As long as you can invoke the API over HTTP, your application can be Microsoft, HP, IBM, Novell, Oracle, even Sun-based" (emphasis mine). Ooooooh... During the NTEN call on Connect, they mentioned that they were only going to publish sample code in C# and Java. It appears, from perusing the documentation, that someone at Kintera saw the light and included PHP code. In any event, Kintera's API goes a long way toward freeing organizations from yet another data silo, and it is free.

Convio has, seemingly, gotten some serious Web 2.0 religion. Open has three components: APIs, Database Connectors, and Extensions. The geek in me thinks the Convio APIs are wicked cool, since they allow you to do client-side programming via AJAX, as well as more standard server-based programming. They work by REST via POST, or JSON. Their server-based code examples only include PHP at this point (the client-side examples are in the expected HTML and JavaScript). Database Connectors are specific tools to help people connect Convio to particular apps, including Blackbaud's Raiser's Edge and ... Salesforce! Extensions are ways to connect the Convio app to other Web 2.0 apps out there. They've got a great Facebook application - basically a template that allows an organization to create its own Facebook app. Extensions also cover the pantheon of Web 2.0 gods: Flickr, RSS, Google, etc. Convio's APIs and Extensions are free, but the Database Connectors have consulting costs associated with them, which makes sense to me.

Bottom line: Kintera takes some important steps to open up their application. Convio takes more, bigger steps that appear to eclipse what Kintera has done. But I think time will tell. I have some advice for both companies, though:

  • Keep going - it\'s looking more and more like not only are people expecting the ability to mash their data and other data more, it\'s also looking like a pretty good business model. Create and foster developer and user communities in the same way that Salesforce and open source communities do. Speaking of communities ...
  • Open up beyond your \"partners\" - Really getting involved in writing apps for either platform requires that people become official partners of the companies. You\'re going to get a much more vibrant developer community involved in developing new stuff for your platforms if you eliminate hurdles. What\'s to lose?
  • Deliver, don\'t just hype - of course, marketing is important, but when the rubber really has to meet the road, be there with more than vaporware. Both companies are making strides, but people want to look under the hood, fast! It would be nice if the announcement and the delivery weren\'t so far apart (we waited a long time for the Kintera docs after the NTEN call - they should have either postponed the call, or gotten the docs done sooner.) Speaking of Kintera ...
  • Kintera: try to catch up - Of course, the big 800 pound gorilla has become Salesforce - and their platform is becoming what people are measuring against. The Kintera API looks, quite honestly, seriously wimpy in comparison to either Convio or Salesforce. But then again, they are better than Blackbaud, which still has no open APIs (that is, ones that are free), let alone anything else. (\"Johnnie, can you spell \'data silo\'?\" ... \"b ... l ... a ... c ...\")
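The SOAP-versus-REST difference between the two platforms shows up in what a client actually has to construct. A rough sketch of the two request styles - the method names, endpoints, and fields are made up for illustration, neither vendor's real API is shown, and nothing is sent over the network:

```python
import json
import xml.etree.ElementTree as ET

# Contrast of the two request styles, with made-up endpoints and fields.

def soap_envelope(method, params):
    """Build a minimal SOAP 1.1 envelope as a string (the Kintera-style approach)."""
    ns = "http://schemas.xmlsoap.org/soap/envelope/"
    env = ET.Element(f"{{{ns}}}Envelope")
    body = ET.SubElement(env, f"{{{ns}}}Body")
    call = ET.SubElement(body, method)
    for key, value in params.items():
        ET.SubElement(call, key).text = value
    return ET.tostring(env, encoding="unicode")

def rest_post(resource, params):
    """Build a REST-style POST: a URL path plus a JSON body (the Convio-style approach)."""
    return f"/api/{resource}", json.dumps(params)

print(soap_envelope("GetContact", {"contactId": "42"}))
print(rest_post("contacts/42", {"fields": "name,email"}))
```

The SOAP side needs an XML toolkit and a WSDL-shaped envelope for every call; the REST side is a URL and a JSON blob, which is why it is so much easier to use from AJAX on the client.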


Some lessons from the \"enterprise\"

On 11 Oct, 2007 By mpm

One of the areas of technology I watch is the world of "enterprise" IT. Think big companies, lots of bucks, lots of boxes. Some nonprofits fit into this category, but most of the nonprofits I work with don't. Sometimes, interesting things come out of that watching. For instance, Gartner's top 10 technologies to watch in 2008. There's some great stuff there, like virtualization, social software, green IT, etc. The striking thing: open source software wasn't on the list. It was on the list for a few years, and has now fallen off - because in the enterprise world, for the most part, open source software is a given part of the mix. It's not a strategic technology to watch; it's old news. It is nice to see that "Green IT" tops Gartner's list. I really do hope the CIOs of the world pay attention.


NOSI Primer, released finally

On 09 Oct, 2007 By mpm With 1 Comment

I've been working hard on one project in particular over the past few months: updating and expanding the NOSI primer, "Choosing and Using Free and Open Source Software: A Primer for Nonprofits". It's taken a while, has some nice new features, and, basically, I'm pretty proud of it.



Getting Naked: Being human and transparent

On 09 Oct, 2007 By mpm With 2 Comments

If you are new to this site, you might want to read more, and subscribe to my feed. (Photo by rob_pym) I'm hosting this week's Carnival of Nonprofit Consultants! Sorry it's a day late. The topic I chose was a tough one, too: I asked people to talk about mistakes they make with clients, and how they deal with them. There weren't many takers on this one, but there are some real troopers out there in the nonprofit consultant blogosphere, so there's some great stuff to talk about. And, since it's a tough topic, I'll not take myself off the hook.

The theme in all of the blog posts, and in my own approach to this issue, is transparency: we all make mistakes, and what's important is being honest about them. Mark, of Sea Change Strategies, talks about five mistakes he's made in working with clients, including things such as ignoring internal organizational dynamics, and getting too involved, or not involved enough. He has some really good thoughts worth considering as we work with clients over time. Michele Martin gets naked - that is, practices her transparency preaching - and talks about mistakes she's made, going into detail on her approach to dealing with them. Michael Stein talks about three different kinds of mistakes one can make in technology consulting, including some errors of process, and the "Hot Dog" syndrome. A great read for anyone who does development for clients.

I've made most of the mistakes outlined in all three of these brave blog postings. I think it is easy to feel like we're the "experts," so we shouldn't make mistakes. It's all too easy (and I've seen it often) to do anything we can to avoid being wrong about something. Transparency - or, to use Michele Martin's phrase, "getting naked" - is key, I think. Being transparent with clients about our own processes and weak points, and where we may falter, and, most importantly, being transparent and honest when we make mistakes, is what can make the difference between happy clients and ones that wish they'd picked someone else.


How not to treat an open source user community

On 04 Oct, 2007 By mpm With 3 Comments

I've been using activeCollab for a few months now. It's designed as a Basecamp clone. It has some things missing, for sure, but it has been useful to me. I had hoped to use it more actively once the new version came out. However, that won't happen: activeCollab is going commercial. It seems to me that they could learn from the other successful projects out there - the really successful projects are supported by a wide variety of methods, whether it be a support model, a nonprofit foundation model, a hosted model, or others. In fact, pretty much every open source project that has gone commercial, or changed its license, has caused a fork that killed the original (like Mambo, or XFree86). They have had an active user community, many of whom, I imagine, are going elsewhere. Luckily, there is ProjectPier, a fork of activeCollab, which will remain open source. I'll be moving from activeCollab to ProjectPier soon.


Social Networks and Digital Sharecropping

On 01 Oct, 2007 By mpm With 5 Comments

I was reading Deborah Finn's curmudgeonly post about Facebook. I have been having curmudgeonly thoughts about social networks in general. My curmudgeonly thoughts fall into three basic categories of sucks: time suck, content suck, privacy suck. Time suck: Social networks are a time suck. Signing up for new ones, making profiles, adding friends, adding applications, etc., etc. And, yet another login and password. At least with the content-focused social networks, like del.icio.us, or flickr, or my personal favorite, our own Social Source Commons, there is some there there. I have reached social network burn-out, and I refuse to join another one, unless there is something truly compelling, and something I could not accomplish in any other way. Content suck: And why do the for-profit social networks exist, when you really get down to it? Nick Carr, one of my favorite smart dudes, calls it digital sharecropping:

What's being concentrated, in other words, is not content but the economic value of content. MySpace, Facebook, and many other businesses have realized that they can give away the tools of production but maintain ownership over the resulting products. One of the fundamental economic characteristics of Web 2.0 is the distribution of production into the hands of the many and the concentration of the economic rewards into the hands of the few. It's a sharecropping system, but the sharecroppers are generally happy because their interest lies in self-expression or socializing, not in making money, and, besides, the economic value of each of their individual contributions is trivial. It's only by aggregating those contributions on a massive scale - on a web scale - that the business becomes lucrative. To put it a different way, the sharecroppers operate happily in an attention economy while their overseers operate happily in a cash economy. In this view, the attention economy does not operate separately from the cash economy; it's simply a means of creating cheap inputs for the cash economy.

It's a big chunk to digest, but it makes perfect sense. As I said in a post a while back, I know that Facebook is getting far more from my time spent on Facebook than I do. They own my profile, and whatever time I spend adding content. It's not really mine, and I don't like that. Privacy suck: Not so long ago, there was a little hiccup in Web 2.0 goodness. A new social networking site, called "Quechup," spammed (without permission) the contacts of people who signed up for the site. That's because a lot of the social networking sites allow you to find other people on their site by giving them your gmail username and password, or your email contact list. There is no question that the social networking space is evolving. But I'm not going to join another social network unless: 1) it is truly compelling on a content level, and provides a way to do things with content that is impossible otherwise, 2) it uses OpenID, 3) it has an open social graph, and 4) I have ownership and control of my own profile data. When all of those happen, I'll be the first to sign up.


Let your voice be heard

On 01 Oct, 2007 By mpm

I've been writing a surprising amount about nonprofit CRM tools lately. It's such an interesting space, and there are some really intriguing things happening with software in that space. NTEN is trying to get a handle on all of this, and find out what people use, and how much they like what they use. I can't wait to get my grubby little fingers on the data on CiviCRM and Salesforce. So, let your voice be heard! Fill out the survey.


Tasty nuggets

On 01 Oct, 2007 By mpm With 1 Comments

A few things have come across my desk while I was on vacation, so I thought I'd collect them here:

  • Of course, there are new Web 2.0 tools that come out every single day. It's a bit staggering, sometimes. I am waiting for this bubble to burst, too, but until then:
    • Timebridge - this seems like a pretty useful scheduling tool. The cool thing is that it integrates with Google Calendar. I just did a trial meeting scheduling - and it worked pretty well. One note, though - the increasing number of new web applications that interface with Google, meaning that there are companies out there with my Google password, is a bit troubling. I wish there were a way to avoid that and still get the integration.
    • DonorsChoose - I am both fascinated and horrified by this site. Basically, the cool thing is that you can choose which school projects to fund - so if I'd like kids to have more hands-on science experiences, I can fund projects to buy things like microscopes ... wait, what?? Microscopes? What happened to our school system that an organization had to be formed to provide a place for thousands (yes, thousands) of projects for school kids? In school. So they can learn. WTF? But, anyway, if you want a good cause, this is one. And the concept is one that is increasingly prevalent: donors get to choose exactly where their money goes, and there is some competition between worthy projects. I'm still on the fence about this concept in general.
    • Razoo - here's another Change.org for you. (There is a post forthcoming where I vent my social networking curmudgeonliness.)
  • Building open social networks - This is a great article on O'Reilly Radar about opening up the "social graph" - it's worth a look.
  • My online identity score is 9/10. That's kinda cool.


Has Apple become evil? No, but they are getting stupid.

On 30 Sep, 2007 By mpm With 3 Comments

I had decided, a while back, not to buy an iPhone. Too expensive, for one thing. I like the 60GB iPod I've had for a while, and although I tire of lugging around three electronic devices (cell phone, Palm, iPod), and my current phone is about to fall apart, the cost and the fact that it was so new made me decide not to go for it, even though AT&T is my carrier. But then, Apple dropped the price $200, and it made me ponder. But nope. No iPhone for me. Why? This is why. I will not be buying an iPhone until they sell an unlocked version that doesn't need to be hacked to use third-party applications. Apple's move to use a software update to break hacked and unlocked phones is somewhat ironic, given Jobs' attitude toward DRM, and the open source basis of OS X. Jobs understands that DRM doesn't work, and doesn't help sell music. He should understand that the same thing is true for iPhones. The good news is that, eventually, hackers always win. A few years down the road, when successive updates of the iPhone get hacked, they will give up, and open it up. And maybe they'll even figure out that open will likely make them more money than closed. But for all of Steve Jobs' smarts, sometimes he can be pretty dumb. So what am I going to replace my current phone with? I don't know yet, but whatever it is, it will be unlocked.


Forgot to say ...

On 25 Sep, 2007 By mpm With 1 Comments

I'm on vacation/fiction writing retreat this week. So, no blog posts from me here until I get back to work on October 1st. But I might write an entry or two on my personal blog, if you're interested.


How to find out about free and open source software

On 20 Sep, 2007 By mpm With 1 Comments

You've been told that insert_cool_open_source_software_project_here might be the ticket for a specific function or system you'd like to implement in your nonprofit organization. Or you're just curious about projects you've heard about. How do you go about finding out whether it's the right software, and whether the project has a healthy community, since you don't want to adopt a project that doesn't have one?

  • Check out the website. Make sure that the features that it outlines there match your requirements. See if they have good documentation.
  • Check out the forums or email list archives of that project. How busy are they? How easily or quickly do questions seem to get answered?
  • Look at the "download" page (or "releases"). When was the last release? How much time generally passes between major or minor releases? (Minor releases are, for example, when a project goes from 2.2.3 to 2.2.4. Depending on the project, going from x.2 to x.3 might be a major or minor release. Going to an x.0 release - for example from 2.x to 3.0 - is always major.) Rule of thumb: projects that haven't had a minor release in a year or more are definitely in danger of becoming projects that are no longer under development.
  • Look at ohloh.net - they have great info on most projects - how many developers, lines of code, how active development activity is.
  • Send queries to nonprofit tech lists for experiences and information, like nosi-discussion, nten-discuss, riders-tech, and others.
  • Google it - you might find articles and reviews that are helpful.
  • Try it out. These projects are almost always free to download and try - though this is easier for some than others. Luckily, most web projects have online demos, which will give you a feeling for the software without having to spend too much time configuring a server or webhost. Many standard virtual hosts have "one click install" or "Fantastico," which makes it easy to try out some kinds of web applications.
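The version-numbering and release-cadence rules of thumb above can be sketched in a few lines of code. This is just an illustrative sketch, not part of any particular tool - the function names, and the 365-day threshold encoding the "year or more" rule, are my own:

```python
from datetime import date, timedelta

def classify_release(old: str, new: str) -> str:
    """Classify a version bump per the rules of thumb above:
    a change in the first component (e.g. 2.x -> 3.0) is always major;
    a change only in the last of three or more components
    (e.g. 2.2.3 -> 2.2.4) is minor; anything else (e.g. x.2 -> x.3)
    depends on the project."""
    old_parts = [int(p) for p in old.split(".")]
    new_parts = [int(p) for p in new.split(".")]
    if new_parts[0] != old_parts[0]:
        return "major"
    if len(old_parts) > 2 and new_parts[:-1] == old_parts[:-1]:
        return "minor"
    return "depends on the project"

def looks_dormant(last_release: date, today: date) -> bool:
    """Flag projects with no release in a year or more as possibly
    no longer under active development."""
    return today - last_release >= timedelta(days=365)

print(classify_release("2.2.3", "2.2.4"))  # minor
print(classify_release("2.9", "3.0"))      # major
print(looks_dormant(date(2006, 6, 1), date(2007, 9, 20)))  # True
```

None of this replaces actually looking at the project's community, of course - it just makes the release-history check quick to repeat across a list of candidate projects.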


What do web stats mean, anyway?

On 17 Sep, 2007 By mpm With 10 Comments

There is an interesting discussion happening between Holly Ross, soon to be ED of NTEN, and Allan Benamer, about web statistics, and whether or not nonprofits should be "transparent" and publish their web statistics. Allan's argument is that because NTEN is in a leadership position in the field, it should lead in showing transparency by publishing its web stats. And, he thinks that NTEN should be responsive to him, as a member, in asking it to publish web stats. Holly's argument is, basically, that web stats don't equal accountability.

The question I want to ask is: what do web stats really mean, anyway? For organizations, web stats are useful indicators of how many people are being reached by their message, the geographical spread of their visitors, and whether or not a specific campaign was successful in driving traffic or creating actions (like donations, or letters, etc.). They are an internal assessment tool which helps organizations figure out what parts of their online strategy are working, and what parts might need tweaking. As some sort of measure of accountability, raw web statistics (this site got x visits and y pageviews in t timeframe) mean zilch. Nothing. Nada. That organization 1 gets 45,000 unique visits per month while organization 2 gets 3,000 means nothing in relationship to the impact those organizations have in the world, or in relationship to how they use their resources. Organization 1 could be spending all of its money on its web presence, and none on its mission; organization 2 could be doing just the opposite. And the mission of the organization matters too. Even for NTEN, which is extremely web-heavy in its mission, raw visit numbers will mean nothing related to how well it is doing its job.

The idea that web stats = some measure of nonprofit accountability is the result of a mindset that suggests that web presence should be the central part of a nonprofit's communications strategy, and that raw numbers of visits have some relationship to how well a nonprofit works. For some organizations with some limited kinds of missions, this may be the case. But for the vast majority of nonprofits out there, web strategy is a small part of their communications strategy, and the number of people who visit their site bears little or no relationship to how well they do their work, or what they do. And, I actually hope that doesn't change. I don't think we want homeless shelters, food pantries, mental health organizations, etc., to care a whole lot about how many hits they got in comparison to similar (or different) organizations. I don't want to start a race to the top of the Nonprofit 25 - where organizations start spending more time worrying about their position on that list, and less time feeding people. Allan says:

Granted, web site stats will not tell me anything about how many hungry people a nonprofit feeds. How odd is it then to teach Google Analytics to nonprofit techies but then say that site statistics had nothing to do with a nonprofit's mission? Why bother having a web site at all? Properly used, web sites are more than just a payment solution for credit card bearing donors. They can be used for a nonprofit's mission and that is why nonprofits should exercise transparency on web site analytics.

How is giving resources to nonprofits, to help them understand how to use web stats for internal assessment of their web strategy, inconsistent with choosing not to publish raw web stats? Asking NTEN to show leadership by publishing web stats is to suggest that NTEN thinks publishing web stats is a useful measure of nonprofit accountability. Holly doesn't think so, and I don't either.


OpenOffice.org to get a boost

On 13 Sep, 2007 By mpm

I've been spending a lot of time with OpenOffice.org lately. I've been running OOo, as it is often abbreviated, for many years now (I used StarOffice before OpenOffice.org was created). I have used it every day, to do everything (all of my spreadsheets, worksheets, articles, and presentations; I used it to write a novel; I used it in seminary for papers; etc.), for at least 4 years. I've not owned MS Office in a very long time. Lately, I've been running the 2.3 Release Candidate to help with QA, which has been fun (and 2.3 looks mighty good - especially with the improvements to Base). I wrote an article on OpenOffice.org for LASA's knowledgebase, and I wrote another one, specifically on Base, that will be published in Linux Identity Magazine (Base is the database component of OOo - new in 2.0, pretty good, and improving fast). I hope to start doing OpenOffice.org training soon.

I happen to think that unless an organization has deeply invested in developing custom Access databases, there aren't too many reasons left not to switch to OpenOffice.org. Actually, even if they have, for word processing, spreadsheets, and presentations, it's really great. It's stable, feature-rich, uses open standards, reads and writes MS files, and, did I mention it's free? No administration fees, no license checking, no running out of licenses for larger organizations, nothin'. Download it and put it on every desktop and get rid of that license manager thingy. In talking with organizations that are using it, adoption issues for staff seem to be fairly minimal (my partner, a non-techie writer, uses it every day, with no complaints). Of course, like all open source software, it is "free as in kittens" - but this particular kitten is pretty grown up, and already spayed and litter trained.

So, here's the great news: Hot on the heels of Microsoft missing the ISO boat, IBM is lending its weight to the OpenOffice.org suite. They are having 35 (!) programmers work on OOo. It's not only that they are going to be contributing to the project - remember the old adage "no one ever got fired for buying IBM"? IBM's reputation is bound to help increase adoption of OpenOffice.org. More adoption means more developers involved, more users helping, more resources available. Outside of the US, OpenOffice.org adoption is growing fast. I imagine that will begin to happen here as well. (In the spirit of full disclosure: IBM has given grants to NOSI in 2003 and 2007 for the NOSI Open Source Primer.)


Economically, open looks better than closed

On 13 Sep, 2007 By mpm

An interesting study was released yesterday by an organization called the Computer and Communications Industry Association (with heavyweight members like Google and Microsoft) which shows that fair use exceptions to copyright generate more economic benefit than copyrights themselves. Here's a tidbit of an InfoWorld report about the study:

By one measure -- "value added," which the report defines as "an industry's gross output minus its purchased intermediate inputs" -- the fair use economy is greater than the copyright economy. Recent studies indicate that the value added to the U.S. economy by copyright industries amounts to $1.3 trillion, said Black. The value added to the U.S. economy by fair use amounts to $2.2 trillion. The fair use economy's "value added" is thus almost 70% larger than that of the copyright industries. The $4.5 trillion in annual revenue attributable to fair use represents a 31% increase since 2002, according to the report, which claims that fair use industries are responsible for 18% of U.S. economic growth and almost 11 million American jobs.

So, if fair use adds more economic benefit than copyrights - what would open source do? Well, we have some data from Europe:

FLOSS potentially saves industry over 36% in software R&D investment that can result in increased profits or be more usefully spent in further innovation.

Asay: Importantly, these savings apply to everyone, not merely open source companies/developers. Open source isn't biased in distributing its benefits.

... • Increased FLOSS use may provide a way for Europe to compensate for a low GDP share of ICT investment relative to the US. A growth and innovation simulation model shows that increasing the FLOSS share of software investment from 20% to 40% would lead to a 0.1% increase in annual EU GDP growth excluding benefits within the ICT industry itself -- i.e. over Euro 10 billion annually.

The evidence seems to be growing. At least on a large scale, open is economically better than closed. Update: Nick Carr thinks the fair use study is "a crock." He has some good points.


Tidbits

On 10 Sep, 2007 By mpm With 1 Comments

I have a few little tidbits, none of which is enough for its own post, so here goes...

  • Nicholas Carr (the Rough Type smart dude) tried out Adblock Plus, and has some very insightful comments about it. He even asks "What would Jesus do?"
  • O'Reilly has a new online series about Women in Technology, with some really great articles. Worth a read!
  • AgencyByte has a great article on how to prevent scope creep.
  • Although the breaks I take during my work day don't look like this, I liked this cartoon.
  • I've been reading Matt Asay's blog on open source business models, and he has an interesting post which suggests that writing documentation is a good marketing strategy!
  • OK, no, I'm still not convinced, but here's an interesting take on Second Life from Frogloop. Also, they have a very interesting ROI calculator for social networks that is worth a look.


Microsoft Fails to get ISO fast-track for OOXML

On 07 Sep, 2007 By mpm With 1 Comments

For those of you who pay attention to open standards, this is old(ish) news. Earlier this week, ISO, the International Organization for Standardization, rejected Microsoft's bid to fast-track OOXML (Office Open XML) to standard status. What this means is that MS will have to take all of the varied input from the ISO bodies, and go through a second vote early next year. Microsoft thinks that it will win approval, but that is far from clear. (If you read that link, which is basically a copy of a press release from Microsoft, you'd think they had it all sewn up. In fact, that is far from the case.) Office Open XML is Microsoft's XML-based file format, now native in Office 2007. Instead of adopting the already ISO-approved Open Document format, Microsoft attempted to get through ISO a standard that, among other things, depends too much on non-standard, non-publicly-available legacy file formats. Which, of course, kinda defeats the purpose of an open standard. Microsoft is in an interesting place with its cash cow, Office. It faces increasing competition from OpenOffice.org, Google Apps, and, on the Mac, iWork. A lot of governments are demanding that document formats be open standards, so it is important for MS to be able to get OOXML through ISO. I'll keep you posted.


Reaping the Benefit of Open Platforms

On 06 Sep, 2007 By mpm With 1 Comments

One of the cool things about free and open source software is that often (not always, but often) it provides an open platform for add-ons. As a full-time user of both Firefox and Thunderbird, I'm really enjoying the benefits of these open platforms, and the immense creativity of the people who create add-ons. And it's all free! A few Firefox extensions I can't live without include a new one I discovered recently, called "Adblock Plus." This is the best thing since sliced bread. It blocks banner ads, Google AdSense ads, stupid dancing aliens for mortgages, etc. I love it. I know a lot of people get revenue from ads, and I sorta feel bad promoting Adblock, but the truth is, I never click on ads, so no one ever gets any revenue from my actions anyway. It's nice to have clean, ad-free pages, especially without the distracting moving ads. Plus, pages load faster without ads. The other Firefox extension that I use a lot is Google Toolbar. It's great to have easy Google tools at my fingertips. For Thunderbird, the two I've been trying out are XNote, which is kinda fun - it allows you to add sticky notes to email messages - and Lightning, a Sunbird calendar plug-in that gives me a calendar integrated with Thunderbird. There are tons of other add-ons for all of the Mozilla suite applications. I'm trying out some new themes soon.


Convio will join Kintera and Blackbaud as a publicly traded company

On 01 Sep, 2007 By mpm With 2 Comments

You've probably heard the news, and I'm taking a break from my break to write about it. Convio has registered to go public. This means that the "big three" nonprofit CRM/Fundraising/Advocacy vendors will all be publicly traded companies, and thus completely beholden to their shareholders to maximize profit. Unlike Salesforce (also publicly traded, but where paid nonprofit accounts are a tiny, tiny minority of earnings), every single penny these corporations earn comes from nonprofit organizations. Thus, every single penny of their income comes from donations that nonprofit organizations raise to, theoretically, fund the missions of their work. OK, so I'm going to sound like a broken record. But, hey, why not? How about some community-owned, community-driven free and open source options? How about options where investment feeds back and benefits everyone, instead of a few people? How about a bigger bang for the donation buck, where the money that nonprofits spend on CRM/Fundraising apps goes to options that just get better and better - a rising tide that truly lifts all boats? This is neither impractical nor rocket science. All it takes is leadership, collaboration, and, most importantly, will.


Varied and sundry before a brief break

On 30 Aug, 2007 By mpm

I'm taking a brief 4-day weekend - not that it will be that noticeable on this blog, since I don't always post consistently. That is actually one of the things I'm going to try to change - to do a post every weekday. We'll see how that goes. Before I take off, I wanted to mention a few things that have been on my mind (and on my plate). First, some of you might know that a new version of the NOSI primer has been in the works. It's new and updated for the realities of the nonprofit technology and free and open source worlds of 2007. The primer, which will appear in electronic form only, also has a very cool implementation of a great open API that we'll be crowing about soon (my lips are sealed right now). I keep discovering new and exciting free and open source web platforms. One of them is called Elgg. Elgg is a very cool community-building and e-learning tool - it combines social networking with features like forums, etc. And further, there is an amazing integration of Elgg with Mediawiki. I wish I had an excuse to install and test it - but it really requires a purpose for bringing a community together. I can think of several interesting implementations in the nptech world that would be fun (it would make a great e-learning platform for, say, circuit riders, or folks like that). Laura Quinn has a great blog entry about software "satisficing" - that is, that sometimes we want to maximize the features that a particular software package has, instead of finding the one that works for us. It's very zen, and a great read. I've been doing a lot of technical writing - I just put the finishing touches on the third of three articles for LASA's ICT Hub Knowledgebase, I'm writing an article on OpenOffice.org Base for the next issue of Linux Identity Magazine, and, of course, I've been busy writing the NOSI primer. I also have been doing some interesting client projects. So I'm very busy, and enjoying what I'm doing a whole lot!
Oh, and my consulting practice has a new logo, done by the fabulous ALR Design!

MetaCentric logo


Mission Statement

On 30 Aug, 2007 By mpm With 1 Comments

I have not at all been tempted by the 31 Days to a Better Blog challenge. Not because I don't want my blog to be better; it's just that I don't have the time right now. But I have been following Michele Martin's work on The Bamboo Project Blog with some interest. There are a few pieces of that challenge I might take up, on occasion. One of them, I'll do now. A few days ago, Michele posted her blog's mission statement. I figured it was a good time to think about and articulate mine. The tag line for this blog is "Conscious, minimalist, neo-luddite perspectives on nonprofit technology." The mission of this blog is to help me, and those who read the blog, think more deeply about how we use technology. To get underneath the "conventional wisdom" of nonprofit technology, and keep asking "why?" It's like that 3-year-old who just keeps asking why, after each explanation. I want to get to the core, to really make sure that our means and our ends are in sync. And, a secondary mission is to educate people about technology that I think is interesting and useful. My blog and my advising practice don't quite have the same mission. My advising practice is much more about educating and helping people with concrete technology tools, and concrete projects. I hope that in the process of doing projects for clients, I am able to ask those deep questions, and help them ask "why" much more often. But my role in that context is much more about helping to solve specific problems, or educating in specific ways about technology options.


Vendorspeak

On 24 Aug, 2007 By mpm With 1 Comments

One of the things I've noticed recently is that my blog is getting the attention of software vendors. I guess that's a good thing. Maybe it means I have "arrived." Probably it just means that when the "Social Media Director" or the "Goddess of Communication" arrived in their office in the morning, they ran their standard set of Google blog and Technorati searches, and voila, there I was. It was, a while ago, part of my job to build technology solutions for people. It was also part of my job to give advice where it was appropriate, but I have come to realize, in my current position of being apart from building things, that I had a bit of myopia, as all builders and vendors do. We like what we build/sell (generally - I'm sure there are exceptions). We think our particular product or service is the best around, or, at least, provides our clients with some unique value. I had good intentions, like virtually all software vendors do. Like many who work in this sector, I cared more about the missions of my clients than I did about my own income, although I also needed to put food on the table. But I was myopic anyway. It's the standard "if I have a hammer, every problem looks like a nail." I wanted to figure out how to make my product solve every problem that my clients had - or, if I couldn't, I wanted to figure out how to build/install/integrate something that could. Vendors, no matter what their intentions and points of view, do have a particular way of speaking. "Elluminate is a leading provider of live Web conferencing and eLearning solutions for the real-time organization." And "... hundreds of NonProfits have already found that ReadyTalk is a good fit for them both technologically AND financially." And "We've designed Yugma to work seamlessly between Mac, Windows and Linux." You get the picture. In fact, if you are a nonprofit, you've spent time wading through that crap (and believe me, I've created my own healthy share of vendorspeak).

A while ago, I blogged about the "scarcity mentality" - the idea that the pie is finite, and it has to be split up. So, of course, everybody is fighting tooth and nail, bit and byte, trackback and comment (and even dollar and cent), to get their piece. There are some trends that make it seem that some vendors are beginning to get the picture that we can all work together - open APIs seem to be on the rise. That makes me happy. My one request is that vendors who comment on this blog take a moment to step back a bit from their myopia, and look at the ways in which they can contribute to a vibrant, active ecology of choices, rather than fight for their own little piece of the pie.


Free software and sustainable computing

On 23 Aug, 2007 By mpm With 2 Comments

I remember in the late 90s, in the heady days of getting organizations up to speed with technology, I would suggest to organizations that they budget to replace 1/4 to 1/3 of their computers each year (instead of the much more common practice of replacing them all at once every 5 or 10 years when a grant happened). This was for all the good reasons: computers are cheap, support is expensive, and it costs more time and money to diagnose and fix an old computer than to replace it - so replacing computers on a regular schedule would actually decrease IT costs. Well, all of that is true. But in this newer era, cost is about more than just dollars: we also have to think about the hidden cost of all of the toxic chemicals, fossil fuels, and water that go into manufacturing computers, as well as the bulging landfills all over the country. So I've been thinking a lot about the role of free and open source software in environmentally sustainable computing. Some of the biggest reasons to replace computers are not so much hardware failure, but software bloat and cruft, and planned obsolescence. You can't run Microsoft's Vista on a computer that is more than a year or so old. Computers that now run Windows 2000 (there are plenty of them in nonprofit offices, I'm sure) probably can't even run XP, let alone Vista. But computers of that vintage can pretty happily run Ubuntu Feisty (the current Ubuntu version). And older computers running Linux make very handy single- (or even multiple-) purpose servers - file servers, backup servers, DHCP servers, routers and firewalls, print servers, etc. The great thing is that converting to Linux won't just help environmentally - Linux has lower IT costs: lower support costs, and no software acquisition costs.

If an office converted from Windows to Linux, it could keep its hardware much longer (five years easily), and have much lower IT costs, thus, in the end, creating a more environmentally and economically sustainable office. Of course, there are caveats. There are software options that don't exist yet, and there are hardware incompatibilities, but these decrease every year. Sometime in September, NOSI will be releasing the new version of the primer on open source software for nonprofits. I'll be announcing it, for sure. There will be some much more detailed information that will help make it easier to figure out if it is right for you.

Continue Reading

The more things change ...

On 21 Aug, 2007 By mpm With 1 Comments

One of the great things about the nonprofit technology community is that the community, on the whole, has a great respect for women, in whatever roles we play - whether geeky or not. The exceptions to this in my experience have been very, very few and far between. However, take one little step outside of our warm and fuzzy community into the larger technology community, and things change. Unfortunately, the open source community seems to find ways to ridicule, degrade and belittle women quite often. Linux Journal ran an advertisement (qsol.jpg) by a company called "QSOL". And it got 2100+ diggs, with the title "Best. Ad. Ever." It originally ran in 2000, to a lot of uproar, and they promised never to run it again. Right. In addition, Linux Journal has a column called "Tech Tips from Gnull and Voyd," with quotes like:

Howdy. My husband is Chester Gnull and I'm Laverta Voyd, and I'm the lady to light a way for all you sweethearts out there who do fancy stuff with Linux. Me and my husband's gonna be bringing you tech tips just about every month now. ... I don't know nothing about Linux. Chester, he's the smart one, but he's not much of a talker. That's why I'm here. ...

One wonders how many bad stereotypes they can fit in one column. Anyway, if you read Linux Journal, please tell them how you feel. I did.

Continue Reading


The search for good web conferencing

On 17 Aug, 2007 By mpm With 22 Comments

I decided, perhaps rashly, that one way of exposing people to, and training people on, open source software was by doing web conferencing. I thought it would be a good endeavor to start with - one that could include free webinars as well as paid training. So, once I decided that, I realized I needed to find the right tools. In my searches for tools that would allow me to start this endeavor, there were several things I wanted:

  • Possibility of showing slides and sharing my Linux and Mac Desktop
  • Audio conferencing (two way)
  • Chat capabilities
  • Clean, professional, bug-free interface
  • Low (or no) infrastructure costs or setup
  • Truly cross platform
  • Inexpensive (but not necessarily free.)

I felt I needed all of these features to make this successful. However, I have realized that it is presently impossible to have all of this in one package. I have evaluated a number of options, and every one of them comes up short in one way or another.

  • ReadyTalk - this had been the tool I thought I'd use. It is totally cross-platform, includes audioconferencing, has chat, etc. Its only drawback is that it's not inexpensive: $.24 per person/minute for web conferencing, plus $.15 per person/minute for audio, is very tough to swallow. Even the unlimited plans (which start at $50/month, or possibly discounted) have per-minute pricing for audio, which I hadn't realized. It was this audio pricing that drove a stake through the heart of my ReadyTalk plan. Note: These rates are not for nonprofit organizations - they are the standard rates. Nonprofits are charged much lower rates (see comment below.) Unfortunately, I don't qualify for those rates.
  • Web Huddle - the only other one I can find that at this time offers the possibility of sharing a Linux desktop. It does, apparently, do audio via VOIP. In my tests it was buggy (some parts of it just didn't work), and the interface was still a bit crude. It is a free service right now, which is good, and the software behind it is open source, which is even better.
  • The others I assessed include DimDim, Yugma, and Adobe Connect. They are all certainly interesting, and DimDim and Yugma are free. (I love that DimDim is using Joomla as their site CMS.) They are all cross-platform for participants, but none of these can show Linux desktops, and some (like Adobe in particular) can't even run the presenter software on Linux.
  • I also looked at a system that The Gilbert Center has been using. It's quite interesting, and it deals with the audio-over-phone issue (which is a major snag) by dispensing with two-way audio entirely, and just using one-way audio. In looking more deeply at it, however, I'd have to do some major technology infrastructure implementation to get it going, which I'm not ready for.

So what's the answer? I don't know. Technologically, ReadyTalk is, well, ready. My wallet, however, is not (even at discounted rates). WebHuddle is not technologically ready, but it may (or may not) end up being what I use for free webinars (either alone, or in combination with freeconference.com.) It is the only one of these options that is both free as in beer and free as in libre, which is important to me. But it's buggy, and it feels like exposing people who aren't already totally sold on open source to open source using a project that's not ready for primetime is, well, one step forward and two steps back. (And, boy, did I just fail the Richard Stallman test!) Perhaps not; I'm not sure. I might try using Yugma on my Mac desktop (I actually haven't tried that yet - I don't know whether it's supported), because the interface on Yugma is clean and nice, and it seems bug-free. But this has also made me re-assess this whole endeavor. Is this what I really want to do? Is there an audience out there? I'm stepping back and thinking a bit more about this.

Continue Reading

Tips for reducing IT costs

On 16 Aug, 2007 By mpm

Nicholas Carr, who writes the blog Rough Type, is a smart dude. Although he writes about the technology field as a whole, and seems very tapped into the for-profit world, he has lots of words of wisdom I think we can learn from. He has a recent article in "Director Magazine," a UK-based business magazine (no, I don't read it; I just got the link from his blog), entitled 10 tips for reducing burgeoning IT costs. Most of the 10 are applicable to nonprofit organizations (I'd skip the "Offshore work" tip.) The one I love best: Procrastinate. We don't all need to be on the bleeding edge to get stuff done.

Continue Reading

Kintera Connect

On 15 Aug, 2007 By mpm With 1 Comments

I listened in on the call with Kintera folks about their new platform, called Connect. I was mostly curious about how open this platform will be, and what the future holds for them. I have become fascinated by the ways the CRM/Fundraising space is changing so rapidly. Basically, Kintera is taking directly from Salesforce's playbook. There are two initiatives that I'll talk most about: their "Connect" initiative, and their data warehousing initiative. These are, for pretty obvious reasons, the most interesting to me personally. They will also be doing some serious UI overhauls and upgrading their CMS. They are also opening a new data center, as well as bringing Akamai technology into the mix.

The Connect platform is a set of APIs, starting with the contact and payment sets of entities, that will allow access (via SOAP 1.1) to the data in the Kintera platform. Basically, third parties will be able to build applications that allow two-way communication with the platform. The APIs will be without cost. The data warehouse initiative will allow their customers access to large amounts of data for reporting and data mining. It seems like it will start out with a local query system, then will be opened up to allow third-party development of data analysis tools. That part looks very interesting.

A couple of annoyances: the documentation for the APIs isn't up yet, and the sample code they are going to publish is in C# and Java! Now, I'm sure there are a lot of large Kintera customers that might be implementing other applications written in C# and Java, but it seems to me that this is, in fact, a pretty big red flag that they really don't have a feel for the technology that the sector is using. Code published in PHP and Python would probably get a lot more people up to speed and interested in building stuff that will integrate with the things a lot of nonprofits really use. I mean, really, how many nonprofits have stuff written in Java? A small minority, I'd bet. (I guess the C# would be useful for the Windows crowd.) On the whole, though, I applaud them for seeing the light and opening up their platform. It will be interesting to see where this leads them.
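The point about C#-and-Java-only samples is easy to illustrate: a SOAP 1.1 request is just an XML envelope, and even Python's standard library can assemble one in a dozen lines. To be clear, the operation name, field names, and namespace below (`GetContact`, `ContactId`, `urn:example:connect`) are made up for illustration - this is not Kintera's actual API, whose documentation isn't published yet - but it's the shape of thing a scripting-language sample would show.

```python
import xml.etree.ElementTree as ET

SOAP_NS = "http://schemas.xmlsoap.org/soap/envelope/"

def build_envelope(operation, params, ns="urn:example:connect"):
    """Build a SOAP 1.1 request envelope for a single operation."""
    envelope = ET.Element("{%s}Envelope" % SOAP_NS)
    body = ET.SubElement(envelope, "{%s}Body" % SOAP_NS)
    op = ET.SubElement(body, "{%s}%s" % (ns, operation))
    for name, value in params.items():
        # Each parameter becomes a child element of the operation
        ET.SubElement(op, "{%s}%s" % (ns, name)).text = str(value)
    return ET.tostring(envelope, encoding="unicode")

# A hypothetical "look up a contact" request; a real client would POST
# this (with a SOAPAction header) to the service endpoint.
print(build_envelope("GetContact", {"ContactId": "12345"}))
```

In practice you'd hand this kind of chore to a SOAP library rather than build envelopes by hand, but the sketch shows there's no reason the sample code has to live only in C# and Java.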

Continue Reading

Technology providers and Linux

On 12 Aug, 2007 By mpm With 1 Comments

In the course of working with some clients, I have been in the process of trying to find technology providers - specifically, server, desktop and network support organizations - that support Linux. Several years ago, they were very few and very far between - mostly individuals who focused solely on Linux. Now there are many more, and traditional Windows shops are beginning to either add staff who know Linux, or learn it themselves. But there still isn't a lot out there. At least in Massachusetts, the majority of nonprofits work with network support people who don't focus totally on nonprofit organizations (there are some wonderful exceptions, however, of companies that focus on the sector.) The good thing is that since the business world seems to be moving ahead much more quickly on Linux and FOSS adoption, companies that work in both the for-profit and nonprofit sectors are gaining Linux expertise - expertise that nonprofit organizations can benefit from. But I think more is needed.

One of the interesting dynamics of any technology provider of any stripe is the way they choose the technology they will focus on and support. I think this is something that many nonprofits, especially those without in-house technology expertise, aren't that aware of. No technology vendor, even the largest ones, can support everything. Most support only a subset (sometimes a very small subset) of the available options. This is because all providers start out with some personal experiences or biases, and most are too busy taking care of clients to spend lots and lots of time gaining new expertise in a broad range of topics - they need to focus. And even if they hire expertise, there has to be enough overlap for synergy to happen. Most of the time, for clients, this doesn't matter. Sometimes it does matter, both to an organization that might use that vendor, and to the vendor themselves in terms of the viability of their business model.
I came across this discussion of Linux distributions and their strengths and weaknesses in terms of vendors who might resell Linux. It was interesting not so much for its focus on the presence or absence of "channel programs" but for the way it characterized the qualities of the different distributions in terms of the business model of reselling Linux. In a sense, of course, if you are a technology provider and you install Linux on some servers or desktops in an organization, you are "reselling" Linux. But since most Linux distributions are free (as in beer), that's not really quite the way to look at it. So I thought I would take from their model and instead talk about distributions from the perspective of the needs of a small to medium-sized technology support organization (for-profit or nonprofit) that serves nonprofits. I'm really interested in helping technology providers get up to speed, so that the amount of support available for nonprofits using Linux (and open source in general) increases. If you are a provider, please feel free to email me if you want more info or help and support in moving forward.

Ubuntu

As this article states, Ubuntu is a very popular distribution right now, and Canonical is working hard to get Ubuntu into as wide a range of hands as possible. The basic philosophy of Ubuntu - "Linux for human beings" - is certainly one that makes sense for a lot of nonprofits, and it has also made Ubuntu the easiest distribution to set up and use, especially on the desktop. Ubuntu has also focused a lot of effort on building community, and has, hands down, the most vibrant, helpful and deep community of any Linux distribution. They have mentorship programs, they are building regional networks, and they have almost unparalleled bulletin board, email and IRC community support. This community is one of Ubuntu's great strengths, in terms of the ability to find helpful and sometimes instant support (via an incredibly active set of IRC channels.) And Canonical also provides professional support. Canonical has a lot of connections in the nonprofit sector. In my opinion, it's a good distribution to start with if you are just beginning to learn Linux and thinking about adding it to your business, because of the vibrant and deep community that is there to provide support. But, as the article linked above says, they don't have an official affiliate program, so it will take some shoe leather on your part to build the business aspects.

Red Hat

Red Hat is the old standby, and is in the server rooms of many nonprofits all over the country. The old adage "no one ever got fired for buying IBM" - which, of course, morphed into "no one ever got fired for buying Microsoft" - in the Linux world might be "no one ever got fired for implementing Red Hat." Red Hat has a well-built business of providing enterprise-level support for its distribution. It was the first Linux distribution to make it big in the business world. Its focus is on servers, and Fedora, its "community version," certainly benefits from Red Hat's development expertise and resources, but it doesn't have as vibrant a community as Ubuntu, for instance. However, because it is so common, has reasonable community support, and has the strength of Red Hat behind it, it might be a good choice, especially if a technology support organization works with larger organizations.

Debian

I'm a real fan of Debian, and have been using it on the server side for a very long time. It is a rock-solid distribution with what is arguably the best package management system. (Ubuntu is based on Debian.) It has a vibrant user community. Debian is the favorite of most serious Linux geeks. The Debian community is dogmatic in its approach to licensing - nothing in Debian depends on software that is not truly free (as in "libre".) Debian used to be one of the most difficult distributions to install, but that is no longer the case, so it is definitely a distribution you could try as a Linux beginner (although beware that the community isn't going to be as friendly to newcomers as the Ubuntu community.) I think it's certainly a possible choice, especially if you've got a philosophical approach that's resonant with Debian's (a lot of activist technology organizations use Debian) and have (or can find) the expertise needed. But it is a bit less known and popular, especially with the presence of Ubuntu, which has most of the strengths (in a technical sense) and none of the weaknesses (in a business sense) of Debian.

CentOS

I'm only including this in my review because I have come across a surprising number of technology providers that have chosen to focus on CentOS. I had not heard of it until then. CentOS is a bit of an odd beast. It is Red Hat, with the branding removed. In their words: "CentOS is an Enterprise-class Linux Distribution derived from sources freely provided to the public by a prominent North American Enterprise Linux vendor. CentOS conforms fully with the upstream vendors redistribution policy and aims to be 100% binary compatible." So they take RHEL (Red Hat Enterprise Linux) and repackage it. So it has everything RHEL has except ... Red Hat support. Which, of course, is why most people buy Red Hat instead of installing Fedora or Debian, etc. I'm not sure what I think of this. I guess it makes sense at one level - here is a way for a vendor to provide basically Red Hat without having to either provide a markup in reselling it, or charge a nonprofit for it. What makes RHEL "Enterprise" is basically the support. That's about it. Debian or Ubuntu are just as "Enterprise-class" as RHEL without Red Hat behind it. So CentOS really isn't a different choice - unless as an organization you are very familiar with Red Hat and want to stick with it, but don't want to pay (or have organizations pay) for it. If you are new to the Linux biz, there isn't any reason I can see to adopt CentOS.

Novell

I think the article says pretty much everything that needs to be said about Novell and SUSE Linux. They take directly from the Red Hat playbook, and have a community version called openSUSE. It's certainly a good option, although in the US, SUSE isn't very common.

Other Distros

There are, of course, five bajillion Linux distributions of varied popularity, any of which could make a reasonable choice for you as a technology provider. (There is a good review on Distrowatch of the top ten distributions.) Only you really know what makes the most sense, given what you want to do with Linux and what expertise you have on hand. And, luckily, once you've learned some Linux, supporting other distributions isn't such a big deal.

Continue Reading

Linux ready for the desktop?

On 09 Aug, 2007 By mpm With 1 Comments

It's been 7 weeks of using Ubuntu 7.04 (better known as Feisty Fawn) as my primary desktop. I figured it was time to give my final assessment. Well, it's not final, but I've come to what I imagine will be a steady state for a while. I'm relatively happy, and, surprisingly enough, I don't miss using the mac every day. Here are the good things:

  • Even macs get weird sometimes, especially if they\'ve been on for a long time. The only time I shut down this laptop is when I\'m taking it somewhere, which is relatively rarely. So I regularly have it on for many days at a time, with no noticeable degradation in performance.
  • I love apt-get/aptitude. 90% of the software I want to install, I install just by saying "apt-get install name_of_package". Any dependency gets installed along with it. It's so incredibly easy. In general, installation and configuration of software has become much easier.
  • OpenOffice works better on Linux than on the mac, as do Thunderbird and Firefox
  • For a few things (mostly system/network tools) there are some incredibly awesome options - amazingly good for free (as in beer) software, and better than software you'd pay for. For some things, there are lots of relatively decent choices.
  • I have no problem getting just about all of my work done using Ubuntu.
  • My printer setup (Brother laser printer) was easier on Linux than on the mac

Here are the bad things:

  • The games available on Linux leave much to be desired
  • Getting proprietary codecs to play (MP3, WMA, etc.) can be a pain
  • XWindows (x.org) can be annoying to configure if you go beyond the most simple setups
  • KDE is a memory hog (I switched to xfce, and am thinking about other light window managers)
  • There are some serious holes in available software
  • Some things take a while to get set up - longer than on a mac.
  • Getting some hardware configured can be painful.

Now the ugly:

  • If you are used to integration between your email, calendar and addressbook, there is only one option (Novell Evolution) and it is not very good.
  • If that integration, with the added integration of a PDA, is very important to you, don't even bother trying Linux on the desktop.
  • Regressions are on the rise (regressions are things that used to work, but break in new versions.)

So, overall, I like it, and I'm sticking with it, with the exception of my addressbook and calendar. It was quite a shame to have to give that up, but it was either that, or make do with either not using a Palm or dramatically modifying how I dealt with my PIM data. So I'm stuck where I don't want to be (where lots of nonprofits are): with a data integration problem that I can't solve right now, and that likely won't get solved for years. Evolution, apparently, is not under active development, because Novell is focusing on Groupwise. The Mozilla Foundation supposedly has been working on an integrated email/calendar/contacts system, but they are spinning off Thunderbird, and Sunbird (the calendaring app) is coming along slowly. So an integrated app from Mozilla is years away, if it happens at all (never mind the Palm part). I'm looking forward to Gutsy Gibbon, Ubuntu 7.10, coming out in October. I'm sure I'll be upgrading. And, to answer the question posed in the title: for me, obviously, a definite yes. But it appears, from my experience and others', that the question actually has to be framed not as "Is Linux ready for the desktop?" but "Is Linux ready for your desktop?" And the answer to that question depends on the unique combination of the type of work you do, your software and hardware needs, and your willingness to put up with certain things. (Although one needs to be willing to put up with some things no matter what OS one picks - it just depends on what you want to put up with.)

Continue Reading

Joining the NTEN Board

On 07 Aug, 2007 By mpm With 2 Comments

Katrin, the Executive Director of NTEN, announced today that I'm joining the Board. (So now's my chance to blog about it!) I've been connected in one way or another to NTEN for quite a long while. I joined back in the day when it had just taken over the gathering that had been called the "Circuit Rider Roundup," which became NTC. NTEN is, to my mind, a critically important organization in the ecosystem that is the Nonprofit Technology field, as the convener of the gathering that anchors the community, as well as a unique and necessary multi-dimensional resource at the regional and national levels for all sorts of constituents in the nptech world. I feel incredibly honored to be on the board, and to be able to give to the nptech community in this leadership role. Although I'm sure that we differ in our perspectives on some issues, I very much look forward to working with the rest of the board to keep NTEN the vibrant organization that it has developed into, and to continue to speak to, listen to, and provide for its constituencies.

Continue Reading


A goodbye to Facebook and LinkedIn?

On 07 Aug, 2007 By mpm With 6 Comments

I've been experimenting with the non-content-centered social networking sites LinkedIn and Facebook for a while now. (The content-centered ones, like flickr, del.icio.us and our own Social Source Commons, are a different animal.) I've been playing with LinkedIn for probably a year, Facebook only for a couple of months. It has been fun, in many ways, but I've not figured out its utility for me in terms of my work, although others have had a better time of it. But something has always been nagging me about them, especially Facebook. In some comments on a post of mine about Facebook, someone mentioned the article "Facebook is the new AOL," and I also mentioned an article I'd read asking how open Facebook really is. Facebook (and LinkedIn) are what people are calling "walled gardens". Even though it is true that anyone can join either network, the data in them is visible only to those who join, join networks, and have friends. I've always been an advocate of open data and open standards, and Facebook is a great example of a one-way street. Wired says:

Therein lies the rub. When entering data into Facebook, you're sending it on a one-way trip. Want to show somebody a video or a picture you posted to your profile? Unless they also have an account, they can't see it. Your pictures, videos and everything else is stranded in a walled garden, cut off from the rest of the web.

I've been slowly but surely realizing that the time and energy I'm putting into Facebook is likely benefiting Facebook more than it is benefiting me. Yeah, it's fun that there is a great mix of people I can keep track of (and they can keep track of me) - that's the part of the equation that's hard to find elsewhere. So I've decided to, for now, keep my accounts, but dramatically curb my time with Facebook and LinkedIn, and spend more time exploring the ways I can use truly open technologies to do some of the same things. There are some great tips in this Wired article. And I'll also be experimenting with the XHTML Friends Network, which looks like an interesting start on an open way to connect people.

Continue Reading


Time to find a fundraising solution that can't be bought

On 06 Aug, 2007 By mpm With 8 Comments

Blackbaud, one of the big gorillas in the CRM/Fundraising space, bought a littler guy, eTapestry. This comes not long after the acquisition of GetActive by Convio. Blackbaud has done other acquisitions in the past, and I'm sure there are more to come. There are basically three types of software acquisitions that companies make. The first is to acquire a company that does something you do not. For example, Yahoo bought del.icio.us - it didn't have social bookmarking. In those situations, generally, the product remains largely the same (with some branding changes over time.) The second kind of acquisition is to acquire a company that does something you do, but much better. Like Google buying YouTube, or Yahoo buying flickr. In that case, the acquiring company eventually does away with its own product, and the acquired product becomes that company's offering in that space (with changes.) The third type of acquisition is when a company buys a competitor, which may or may not have technology the acquirer lacks. In that situation, the acquired company is basically engulfed by the acquiring company, and eventually (or immediately, in some cases) completely disappears as an option.

This third type has been the hallmark of the acquisitions in the CRM/Fundraising space. GetActive is no longer an option to choose from. Nor is Giftmaker (bought by Blackbaud.) True, eTapestry had a platform that Blackbaud does not - but don't mistake this for the first or second type. eTapestry as a separate choice is bound to go away. And this is a bad thing for the many small organizations that have been using eTapestry for reasonable prices (or free). You have heard me rant and rail about the fact that the vast majority of money (both from nonprofits themselves, and from investors) goes into developing, maintaining (and acquiring) CRM/Fundraising software. This is something that, honestly, we as a sector are complicit in. And there are fewer and fewer choices every single day. Fewer choices means less competition, which means that prices will likely rise. And nonprofits often feel they have no choice but to pay big bucks for fundraising/CRM packages. If nonprofits want a good fundraising platform that they know won't be bought, swallowed, and changed so that they'll have to shell out more, it's time to invest money and effort in an open source platform. One already exists that needs support and development to make it ready to compete with the big guys. Allan Benamer says:

Obviously, Blackbaud is taking a page out of Oracle's playbook and applying it to themselves. Rapidfire acquisition of smaller players so that you can wrap it up into a system of systems seems to be their strategy for now. They now control the vertical fundraising environment for nonprofits from the base of the nonprofit market (eTapestry) to its apex (Target Software).

Blackbaud is publicly traded. It is important to remember that dollars raised by nonprofit organizations are going to Blackbaud's investors, whose major interest in Blackbaud is the profit it can produce. That is the driving force behind what Blackbaud is doing - maximizing profit. It is unrealistic to expect that acquisition mania in the CRM/Fundraising space is going to result in anything except fewer, more expensive choices. (Remember that as good and open and free as Salesforce is, it also can be acquired, and nothing is guaranteed.) We don't have to submit to the "Buyout Blues"! We have power and options in using open source solutions. Isn't it time we began to realize the power of community-owned and driven software that no one can buy?

Continue Reading

Giving up, a little

On 06 Aug, 2007 By mpm With 1 Comments

As you might know, I migrated from using a MacBook Pro laptop as my primary desktop to eating my own dogfood, as it were, and using Ubuntu Linux as my primary desktop. And, as you might recall, there were a few snags. My address book was a major one. And, to top it off, I had to make things more complicated last week, because I decided to get a Palm PDA again (I found a great, really cheap E2 on eBay). So there was the saga of migrating my data from Apple's proprietary address book to Evolution (and, of course, dealing with the beast that Evolution is). And then I wanted to sync my new Palm. There were several snags:

  • A bug in Ubuntu which prevented the "visor" driver from being loaded at startup
  • Gnome-pilot/Evolution can't sync more than one calendar category, one addressbook, or one to-do list category, even though the Palm has multiple categories
  • Another bug in Ubuntu which causes the sync to crash if there are to-do items with no due date
  • JPilot, the alternative, has a user interface reminiscent of, but worse than, that of the Palm desktop

So, basically, there wasn't a way for me to get a nice, usable sync of my data that was going to work for me. As I might have mentioned, it was this very thing that stopped me last time (although, admittedly, it was much worse last time - I had to recompile my kernel to get my Palm to sync, and I drew the line at that). So I'm not giving up on using Linux as my desktop, but I am giving up on using Linux to hold my calendar, addressbook and to-do lists. I'm going back to using my mac for that - and installing Spanning Sync to sync Google Calendar with my mac calendar, so I have a calendar I can use on my desktop. Since Evolution is such a bad mail client, I'm going back to Thunderbird. So I still don't have a good addressbook on Linux - and I certainly don't have one that is in sync with my Mac. (Yes, I know, I could install an LDAP server. Yeesh.) Sigh.

Continue Reading

Odds and Sods

On 27 Jul, 2007 By mpm With 2 Comments

(That's Brit/Canadian for odds and ends, if you were wondering.) I've been a bad blogger over the past couple of months, I know. I haven't been posting nearly as much as I'd like, mostly because I've been rather busy. It's not just consulting work, but other varied happenings in my life as well. I don't know what the next couple of months will bring, but I'll try to generate some pithy blog posts for y'all.

I've been re-reading a book that I read early in the decision-making process about becoming a consultant, way back in 1996. It's called The Consultant's Calling, and it's really a book about consulting as a calling, a vocation. One of my favorite quotes about vocation is from Frederick Buechner, who said that a vocation is where "the world's deep need and your deep gladness meet." There is a new, updated version of the book. It's really worth a read.

I'm convinced that I have a sense of humor. Really, I am. Except, well, I just don't get this LOL* phenomenon. I admit I'm a sucker for cute pictures of kittens, but the bastardized language thing ("hai, I iz doin stuf") just isn't funny to me. I am glad some people are enjoying themselves. And LOLnptech seems to have quite the following (just not me.)

I've been writing a series of articles for LASA's ICT Knowledgebase. The first article, on Mac database options, just got published last week. Ones on Open Office and FOSS on the Mac are forthcoming. It's enjoyable writing them, and also writing for a non-US audience (although I'm sure plenty of US folks take advantage of their amazing resource.)

This will be my last tech blog entry until at least August 6th. I'm taking time off to work on science fiction writing!

Continue Reading

Where the gift economy rubber meets the road

On 19 Jul, 2007 By mpm With 2 Comments

In the process of reworking and updating the NOSI (Nonprofit Open Source Initiative) primer that was first written in 2004, several things have emerged that have dramatically changed since then. First is the wholesale movement toward the three major open source CMS platforms/frameworks: Drupal, Joomla and Plone. Second is that Linux servers seem to have made very serious inroads into nonprofit organizations, such that they are becoming almost commonplace. Third, almost everyone uses Firefox, or at least knows about it.

The fourth, and very interesting, development is the relationships that have been developing between nonprofit-focused technology providers of all stripes and open source developer communities (at this point, primarily CMS projects.) There will be detailed case studies in the new primer, but what's been striking to me is how many examples of this there are out there. Technology providers are beginning to really invest in free and open source platforms, and it looks like everyone is benefiting - the organization, the clients, the developer communities, and, by extension, other providers and users of those projects. And so the feedback cycle keeps going.

This may be where the gift economy rubber meets the road. Providers seem to be surviving (or thriving) with this model, free and open source software projects are getting the support they need, and clients are getting the software solutions they need. This is a model that is impossible with proprietary software. It's a model I hope spreads beyond the CMS space, into other areas. There are all sorts of worthy candidates!

Continue Reading

More on Facebook

On 17 Jul, 2007 By mpm With 3 Comments

Michelle Martin has a great post this week on Facebook. It introduced me to a new blog, called Read/WriteWeb, which I'm liking a lot. They have a roundup of what they are calling the 10 best Facebook apps for work. In the last few weeks of using Facebook, I'd already realized how it's fitting into my workflow in ways that no other social networking site does. I haven't tried a number of these apps yet. I don't intend, for instance, to start putting my calendar on Facebook, but it's interesting to see how much is happening, and how fast.

I've also noticed how most people in the nptech world who experiment with this stuff have moved over to Facebook. One of my questions is whether they still dedicate any time to LinkedIn or other networking sites, or whether Facebook has become the one they spend the most time on. I never did start a MySpace page, and I don't imagine I ever will. I was pretty doubtful about the general usefulness of content-less social networking sites (as distinct from those that are content-driven, like flickr and del.icio.us), but it seems that Facebook is becoming a platform, and has ceased to be simply a social networking site.

One of the great things about the Facebook platform is the way it can integrate online data. I've got my flickr photostream up there, my del.icio.us bookmarks, and all of the varied data on the varied Facebook apps. There isn't an easy way, for instance, for me to see others' data without actually clicking through to their profile, but I'm sure the interface will improve over time.

Still, although Facebook has been fun to play with, and many of my colleagues are using it, and it doesn't take away from my workflow - it hasn't actually helped me do much work. That's the next question - will my presence on Facebook help me find clients, or help clients find me? Will it help my work with clients? These are questions that are yet to be answered.

Continue Reading

Sweet tasting dogfood...

On 15 Jul, 2007 By mpm

Part of the process I'm going through of "eating my own dogfood" - that is, using free software (open source) tools whenever I can - includes taking myself off of proprietary platforms whenever possible. One such platform was Typepad, a paid service based on Movable Type, a very popular blogging platform that is proprietary. They are going to release an open source version later this year, which is wonderful, but I also would have had to pony up another $149 for a year of a Pro account, and that seemed excessive, since I could just as easily set up a WordPress blog on the host I'm already paying for.

This migration, unlike the Mac OS -> Ubuntu migration, has been completely painless. A few tweaks (mentioned in the previous post), and I was up and running with all posts and comments intact. Add a few important plugins, and I'm back to where I was just a few days ago on Typepad.

Two and a half years ago, when I moved off of the blogging platform I wrote, I wanted a platform that would allow me to concentrate on writing, and not on tech. At the time, there wasn't a platform that was really ready for that. Now, there is. Actually, there are several. There is no question in my mind that free software has won the CMS/blogging race, hands down.

Continue Reading

Welcome to the new blog!

On 15 Jul, 2007 By mpm

As you will have undoubtedly figured out - this blog has moved! I've moved it off of Typepad, and onto WordPress. I've been rather impressed by how easy WordPress was to set up and use, and how easy the migration process was. I'd recommend it to anyone. To recap, I took the following steps:

  1. Set up my blog on a different domain (in this case, it was zenofnptech.com.)
  2. Chose a theme.
  3. Migrated the posts and comments (exporting from Typepad, importing into WordPress - all web-GUI based, very easy.)
  4. Modified a few things (see this link.) WordPress now uses dashes instead of underscores, so that's one thing you don't have to worry about. Importing now just requires going to the WordPress import tool and specifying the file. You don't have to worry about using mod_rewrite at all.
  5. Got the varied blogrolls and badges, etc. copied over.
  6. Let people know.
  7. Changed the DNS of the old site (and changed the site settings in WordPress.)
  8. Done.

The feed should stay the same. If, for some reason, yours stops working, try this feed.

Continue Reading

About

On 13 Jul, 2007 By mpm

Michelle Murrain received her B.A. in Natural Science and Mathematics from Bennington College, and her Ph.D. in Biology from Case Western Reserve University. She first started to work with nonprofits and technology in 1996, when she assisted a local women's health organization with a Linux server that provided email and a website. That was also her introduction to open source software, which she has used consistently since that time. Michelle has been involved in developing content and applications for the web, specifically for organizational, research and educational purposes, since 1994.

In 1996, Michelle started a consulting practice that served the nonprofit and educational sectors, primarily in the areas of developing database-driven websites, the implementation of open source software, and strategic technology planning. She has worked with a wide variety of nonprofit organizations, mostly in human services, women's health and education. From 2003 until 2005, she worked with Database Designs Associates, based in Boston, MA. She was on the board of Aspiration, an organization that fosters software development in the nonprofit/NGO sector, and she presently is on the board of NTEN, the Nonprofit Technology Network.

In 2005, Michelle took a sabbatical from nonprofit technology work to get her Certificate in Theological Studies at Pacific School of Religion in Berkeley, California. She is currently Coordinator of the Nonprofit Open Source Initiative (NOSI), and she does some strategic technology consulting as well. She blogs on other issues in her personal blog, Metacentricities.

Continue Reading

Eating my own dogfood. It sometimes tastes yucky.

On 09 Jul, 2007 By mpm With 3 Comments

So I talk a lot about both open source software, and the preciousness of one's own data. I rail against vendors who promote lock-in. I tout the benefits of open source software. So, here is a real-life example from someone with a measly 195 records in her contacts database.

As you might recall, I migrated from a Mac desktop to a Linux desktop a month and a half ago. There are still some, shall we say, hanging chads. One big one was my address book. I used to have this great system where I used the Mac Address Book, which would nicely sync with my cell phone. It also integrated well with Mail.app and iChat. It was great.

First problem: Linux address books ... suck. I hate to be so blunt, but it is true, at least in comparison to the ones on the Mac. There are basically three options. 1) Since I'm using Thunderbird as my email client, I could use that as my address book. Except... it sucks. Really, it does. Not enough fields, not a good UI. Ick. 2) KAddressBook. It's not as bad as Thunderbird - it's a bit more polished, with more configuration and more options - except, of course, it doesn't integrate with Thunderbird. And it's still not good. 3) Evolution. It would mean switching my email client. It might be worth it. But the last time I tried Evolution, it was a horrible experience. Then again, that was 4 years ago, and open source projects do get better.

Actually there is a fourth option. I could dump all my addresses into one big flat file, and use grep. Right. Errr. NOT.
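For the record - tongue firmly in cheek - the flat-file-and-grep "address book" would look something like this (contacts.txt and its entries are, of course, made up):

```shell
# Build the world's simplest address book: one contact per line
cat > contacts.txt <<'EOF'
Ada Lovelace, ada@example.org, 555-0100
Grace Hopper, grace@example.org, 555-0199
EOF

# "Look up" a contact, case-insensitively
grep -i hopper contacts.txt
# → Grace Hopper, grace@example.org, 555-0199
```

No fields, no sync, no UI. Like I said: NOT.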

So my next task is to really try out Evolution, and see how it works for me. I'll keep you posted.

But, there is more...

In order to use one of these address book options, I have to get my data out of Apple's Address Book. Turns out, there's no "export" menu item. Yeah, talk about lock-in! There is, luckily, a handy-dandy tool that will do it for you. Otherwise, you have to either write your own, or, worse, hand-enter all those addresses again.

<Insert sound of Michelle chewing on Purina Dog Chow.>

Continue Reading

See a problem? Throw a website at it!

On 06 Jul, 2007 By mpm With 1 Comment

Deborah is, as per usual, diplomatic in her discussion of the site "Sustainable Nonprofit", a new(?) website that is designed to "create a unified place for nonprofits and experts to share their experience, pain, achievements, and discoveries."

I love the idea, really, I do. The site is beautifully designed and engaging, with some good information and tools as well. It's so nicely done, I wish it could indeed become that unified place. Except...

Everyone wants their site to be the unified place. Wishing, unfortunately, won't make it so. "If we build it, they will come" only works in the movies.

In a conversation with a colleague over some wonderful Asian fusion lunch, we both agreed that the problem nonprofits faced in the 90s - lack of good information - was, in a broad sense, mostly solved (there are, for sure, areas where gaps remain, but overall, the coverage is quite good.) In fact, we agreed we are getting toward overload - too much information in too many places. OK, so here is yet another website providing information that nonprofits need to be sustainable.

I think the time has come to think differently. Let's stop throwing new websites at problems for a while, and think more deeply about why those problems exist. When it comes to nonprofit sustainability, my hunch is that, as wonderful as the intention is, a new website isn't going to make a dent.

Continue Reading

Gender, Race and Open Source

On 29 Jun, 2007 By mpm With 1 Comment

My session on free and open source software at the US Social Forum went great yesterday. Lots of people were there (I ran out of handouts - I was surprised to see how many people showed up.) The presentation is available on my wiki (it's at the bottom.)

There was a very wide range of people there, from folks who didn't know a whole lot about open source, to those who were developing open source apps. Toward the end, a young man who worked with urban kids of color on media and music commented that he didn't really know how to get access to the kinds of things available, and he noticed how few people of color were in the room. He wanted to know how to get his kids access to affordable tools for creating and editing media.

An older woman of color noted that a lot of the problems that open source developers were solving weren't problems that communities faced. Which, of course, has been an issue for me for a long time - the lack of open source alternatives in the "vertical" application spaces - case management, etc. But it goes even further than the vertical apps - what "itches" predominantly white, privileged (and, I might add, mostly young) men? And aren't those different from the issues other groups of people face? This is not, of course, to suggest that there have been no efforts to produce software that addresses social and human needs - there have been a lot. But their number pales in comparison to, say, the plethora of network-sniffing tools.

There was an interesting mini-conversation, which, in retrospect, I really wish I'd had a chance to explore more, about the supposed "egalitarian" nature of free and open source software development. One person had brought up the idea of open source as a model for egalitarian participatory economics, and I made a brief comment that it wasn't all that egalitarian, really. My experience, and the experiences of many women who are involved in open source, make this clear.

Both of these things have led me to think a lot about this topic. Of course, as an African American woman, I am a pretty unusual spokesperson for free and open source software. I most often find myself in a room full of people who are not at all like me (at least in the realm of identity - in actuality, they are a lot like me in inclination, but that's a different conversation.) There were about 35 people in the room for the session; about 8 were women, and about 7 or 8 were people of color (with overlap between the two - probably 25-28 out of 35 were white men.) This was the most diverse crowd I've ever talked with, or been in, for an open source conversation. That speaks volumes to me. It is also true that it was far from a representative sample of the people here at the US Social Forum (which is way more female and of color than that group.)

I don't really have any easy answers to this, but it makes me think more about what I'm seeing as a gap, at least here in the US. We have a collectively-owned, freely available set of tools that are usable and useful, and can even be used on older hardware. And the communities that could make use of, expand, extend, and take ownership of these tools don't have access to them, for a wide variety of reasons that at some point I should articulate, but that have little to do with money directly. This feels like a different part of the digital divide. It's not just about access to resources in an economic sense.

Unfortunately, none of this is especially simple to address. But it needs addressing.

Continue Reading

An entire huge conference run on Free and Open Source software!

On 28 Jun, 2007 By mpm

I've been rather remiss in my blogging lately, mostly because I've been crazy busy, and blogging seems to be lower on the priority list these days. Today, I'm at the US Social Forum, a huge gathering of activists from all over the United States, who have come together in the same model as the World Social Forum. The slogan for the US Social Forum is "Another World is Possible. Another US is Necessary."

It's huge - there are 10,000+ people here, with hundreds and hundreds of sessions on topics as varied as global workers' rights, feminist economics, queer activism, and the rights of indigenous peoples.

The key thing for me about this conference is that the technical infrastructure - from the server of the site itself (LAMP + Drupal), to the 90+ computers set up for registration, the media center, backbone routers, training labs, and email stations (Ubuntu), to the entire registration system (a customized interface on top of Drupal) - is running entirely on free and open source software. The operating systems are all Debian (servers) or Ubuntu (desktops), and all of the desktops have Firefox, OpenOffice and such.

There have been glitches, primarily on the user end. I had one person ask me "How do I get to 'My Computer'?" (which, of course, you can, but it looks different). I did question the decision to use Gnome on the user desktops instead of KDE - since KDE is much more "Windows-like" in the places it puts things. I have to admit, though, having used Gnome a bit to set things up and get documentation written, I'm beginning to appreciate its more spare approach.

I spent two days in high-pressure mode on the tech team, getting some Linux routers configured, DNS working, a fileserver for media set up and working, and other odds and ends. It was a crazy couple of days, but I had a fabulous time, and learned an incredible amount over the course of those days. It makes me want to go home, order up some PC parts, and start working on a new Linux box - I've got lots of fun stuff to play with now.

Continue Reading

Facebook the last frontier?

On 18 Jun, 2007 By mpm

OK, so I finally drank the Facebook Kool-Aid - and although it took me a long time to get around to it, I have now realized how many people in the nptech field have already been on Facebook for a while. I decided to try out Facebook when I kept hearing about the integration of other social networking sites into it.

Relatively recently, I'd finally, after tons of invitations, joined LinkedIn and invested time and energy into my profile. I'm still not convinced by any of these social networking sites not directly linked to content (unlike del.icio.us or flickr) - but it's fun to play with, anyway. And, it is beginning to appear that Facebook will integrate better with my workflow than LinkedIn, because of Facebook applications.

So we'll see how it works. I'll keep you posted. And, if you're already on Facebook, add me as a friend.

Technorati Tags: nptech, facebook, web2.0

Continue Reading

A big player jumping into Linux

On 15 Jun, 2007 By mpm With 1 Comment

Intuit, the makers of QuickBooks, Quicken, and TurboTax, are jumping on the Linux bandwagon. Well, OK, they aren't really jumping, but putting a toe in the water in the Linux server realm.

They announced that their server product, QuickBooks Enterprise, will be released to run on Linux servers.

One of the biggest issues for nonprofits in terms of adoption of Linux has been the lack of ready-for-primetime applications like accounting. Intuit, the 800-pound gorilla in that space, moving to Linux is a great sign for the future. I certainly hope that a Linux version of QuickBooks for the desktop will be a next step. It would be a big step, but it would be a good one. Yes, it's proprietary, and I certainly wish that someone would write a worthy open source competitor to QuickBooks, but no one has yet, so a good second choice is QuickBooks itself running on Linux.

Continue Reading

Circuit Rider School

On 11 Jun, 2007 By mpm

Way back when (last month - I've been busy), Deborah Finn blogged about the "New England School for Circuit Riders." That blog entry came about because she and I had a long conversation about what kinds of skills nonprofit technology providers needed, and what we felt was missing.

Hot on the heels of that (OK, not so hot - about 3 weeks later), I had a great conversation with my old colleague Marc Osten about some work LASA is doing around providing support to technology providers in their neck of the woods (that'd be the UK.)

I realize that we've been having this conversation ever since the NTC was called the "Circuit Rider Roundup." It's not that there is a lack of technology vendors and support. It's that there is a lack of really good support - responsive, empowering, educational, integrative, and knowledgeable about, and invested in, the sector.

For those of us who'd like to see organizations get better support - how do we do that? I think part of the answer has to be providing the resources for people to become better providers - whether that means helping budding accidental techies get off the ground to become great IT staff or independent consultants, or helping individuals and small consulting firms learn what makes for really good nonprofit support.

There are many challenges - how do you teach self-reflection and self-evaluation? How do you teach the ins and outs of the nonprofit sector? How do you get providers to invest time and energy in what is really a marginally profitable business?

I don't have too many answers today, but living inside the questions for a while is always a good start.

Continue Reading

Varied and sundry

On 01 Jun, 2007 By mpm

It's been a week of mostly not working, which is a nice rest. I finally finished the first edited version of the scifi novel I wrote last summer. That feels good. Next steps are to get some feedback, and move forward with it, somehow. I had a brief email conversation with Cory Doctorow, a science fiction author who is also a copyleft activist, and who releases everything he writes under a CC license. He suggested, basically: find the publisher first, then talk about the license second. That sounded like good advice, since it might take me quite a while to get to step 1. (If, perchance, you might want to read it, drop me an email.)

I'm on week 3 of my Ubuntu laptop migration - things are smoothing out. I've got audio working, and I can listen to MP3s and audio streams. Flash (and, therefore, YouTube) is working, as is Java. I did a webinar for NTEN on it - ReadyTalk worked just fine. I still haven't figured out how to get higher resolution on my laptop screen, but that's mostly due to lack of time spent trying to get it to work. I also have a document nightmare - I have documents on the desktop, documents on my laptop, documents on external hard drives, aiii. I need to figure out a good network configuration.

There's been some interesting activity in the realm of women in open source. There is a podcast with a group of women developers that was recorded during RailsConf. It's definitely worth a listen. There is a part two coming, I understand.

Also, I'll be moving this blog soon - probably next week. I decided to move both of my blogs off of Typepad, and onto other platforms. My main blog is moving to WordPress, and this blog is moving over to the Metacentric.org Joomla CMS.

I\'ll keep you posted on URLs and feeds.

Continue Reading

Ubuntu Linux, Week 2

On 22 May, 2007 By mpm With 4 Comments

Well, it's not really week 2. I got the laptop a few days ago - but it was last week. I figured this was a good time to post an update, and complai... explain where I've gotten to so far.

I'm using it full time now as my basic desktop. I'm reading email, posting blog entries, searching the web, working on presentations, etc. I have definitely hit some points of pain in the migration.

What\'s fine:

  • The web was painless. I've been using Firefox for a long time anyway, and all I had to do was install a few extensions (and Google Sync, which rocks) and I was up and running exactly as I had been before. And since so much of my workflow is in Web 2.0 apps, it all works great.
  • I had converted to IMAP a while back in preparation for this change, so all of my old mail and folders are now sitting on a server. Thunderbird is a bit different from Apple Mail.app, so it's taking me a bit of time to get used to it.
  • Skype seems to work fine (I haven't tried to make a phone call, but I usually use it for chat anyway, and it works fine for that.)
  • There are a lot of open source apps that I've already been using (XChat for IRC, OpenOffice, GIMP, Scribus) that work just the same, and can read and write all of the same docs I've been using.
  • I found some good screenshot software.

What\'s been problematic:

  • Wireless networking - it took a bit of work to get it going initially, as I mentioned in my last post. Now it seems to work fine - I've used it with two different open access points. I have yet to try it with a closed access point - I've heard that WPA can be problematic.
  • Video - the resolution the generic driver provides is lower than the resolution my laptop can handle. I had to install new drivers, and I have not yet gotten a working configuration. I posted this plea to the techtalk list on LinuxChix. Hopefully I can find a solution.
  • For some really odd reason, Konqueror, the web browser that comes with Kubuntu, can't see any external websites. Every other program does fine (GAIM seems to flake out at times.) I haven't solved it, and I hate Konqueror anyway, so it doesn't really matter. But it's quite odd.
  • Proprietary media doesn't play by default. I totally get why this is true, and it's not Ubuntu's fault - it's the fault of those who license the proprietary media. I wish everyone would just switch to Ogg Vorbis - it would make life easier. But, fat chance. So I'm having to download and install all sorts of strange stuff in order to play MP3s, QuickTime, etc. Installing Flash was kind of a pain, and I had to resort to the command line.

What\'s unclear:

  • I haven't done much with sound yet.
  • I don't know what I'm going to do for an address book, or how I'm going to get it to sync with my cell phone.
  • There are several key pieces of software that I use every day that I don't know how I'm going to replace. One is the blog client ecto - there really aren't any good equivalents for Linux. There are also Journler and Scrivener - two great apps for which there are no Linux equivalents (actually, there are no Windows equivalents for these either.) And there are a whole host of tools and games I've gotten used to that likely have no good replacements at this time.

The bottom line: if I were the type of person who did mostly email, the web and word processing, and the occasional spreadsheet or presentation, I'd be off and running, and doing just fine. And, actually, I am off and running, and doing just fine. But if I hadn't been so familiar with Linux, some of the stuff (like wireless) would have stymied me if I couldn't resort to the command line. Ubuntu doesn't come out of the box with a decent wireless network application - if I were them, priority #1 for the next version would be seamless wireless, at least as good as what's present in Mac and Windows. I can't blame them for the driver problem with my laptop, really.

But since I'm a power user, and have gotten used to Mac tools, which are great and user-friendly, it's going to be a bit painful at times, I think. But I'll be getting my work done, for sure.

Continue Reading

Linux, Ubuntu Feisty Fawn, and Me

On 21 May, 2007 By mpm

I've been a part of the Nonprofit Open Source Initiative for a long time, and I've been advocating for the use of open source software in the nonprofit sector for years. Lately, I've been working to focus my advising practice on helping people implement open source software (mostly server-side) in their organizations, providing advice and training. I've installed more versions of varied Linux flavors than I could even think about remembering (going all the way back to the first or second versions of Slackware in the mid-90s). I've been responsible for administering many Linux servers over the years, some Red Hat, some Debian.

And, for all of that time, the Macintosh has been my primary desktop. I had a (very) brief flirtation with Windows 2000 as my primary desktop, but ever since 1987, when I bought my first computer (a Mac SE), I've owned at least one Macintosh. I'm not about to change that.

I've tried making Linux my primary desktop many times (5, at last count.) Something always got in the way of migration. In the beginning, it was lack of software (I first tried this back in 1999), or printer drivers. More recently (the last time I tried this was back in 2004), it was not being able to sync with the Palm Treo I had at the time.

But Linux has changed, and I have changed. And, in some ways, NOSI has changed - we're thinking more and more about talking about Linux on the desktop, which for a long time we thought was not ready for nonprofit primetime. I think it's ready now. I will certainly see. This is the 6th, and last, time I will do this. Why last? Because I've decided that no matter what, I'm not going back. Because I want to understand, in the most personal possible way, what the pains (if any) of migration to an all free and open source platform will be.

So, I did some research, and realized that the best choice for me was to get a Thinkpad - most everything works right out of the box. I have been, unfortunately, a bit hampered by the fact that my satellite modem died last week - so we've been on dial-up at home (and broadband at the "local" cafe). But here's Ubuntu week 1, not edited or smoothed out. I'll understand points of pain, for sure.

Week 1

I should have taken pictures - unboxing a new laptop is a lot of fun. I got a Lenovo Thinkpad Z61m. Good specs, cheap price. My first step was to make sure the laptop booted. It booted fine; I stopped at the license agreement. I popped in the Feisty Fawn (Kubuntu 7.04) CD that I'd burned from a downloaded ISO, and rebooted. Once Ubuntu finished booting, I clicked the wonderful "install" icon at the top. Because the recovery media for this laptop was on the hard drive, and I also wanted to create a separate /home partition, I did a manual partition, deleting both partitions on the hard drive and creating three new ones: /, /home, and swap. (I might regret hosing the recovery media without getting it onto CD, but I hope not - I was in a purist mood. I would have had to agree to the license agreement for Vista and activate the product in order to burn the media, and I wasn't about to do that.)

A few minutes later, I had a Ubuntu install with KDE - but it was bare bones. The next step was to get online. That was the first snag. Ubuntu doesn't come with an easy GUI way to connect to a wireless access point by default; I had to go to the command line to get online. I imagine if I were wired, it would automagically work (that's been my experience in the past.) So I had to dig iwconfig out of my memory (and do some online looking). I also ran into a weird problem with a daemon called "avahi-daemon", which is basically the Linux implementation of Bonjour. I'm glad it's there, but it mucked with my network, and it seemed strange that it was on by default.

So, I finally got on my wireless network and got online (I had to use a CLI tool called dhclient to get an IP address. That was annoying.)
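For the curious, the command-line dance went roughly like this (a sketch from memory, not a recipe - the interface name and network name here are placeholders; yours might be wlan0 or something else, which iwconfig will tell you):

```shell
# Bring the wireless interface up (eth1 is a placeholder name)
sudo ifconfig eth1 up

# Associate with an open access point by its network name
sudo iwconfig eth1 essid "CafeWireless"

# Ask the access point's DHCP server for an IP address
sudo dhclient eth1
```

All of which a decent GUI tool does for you with two clicks, which is rather the point.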

So far, the major pain has been the wireless stuff. We'll see how that goes once I'm able to download some of the good wireless GUI tools out there (like NetworkManager, which I hear is good.)

Next up, let's see how the details of migration (web, mail, address book, etc.) work.

Technorati Tags: linux, nptech, opensource, windows, ubuntu

Continue Reading

More FUD from Redmond

On 18 May, 2007 By mpm With 2 Comments

I hear, in my head, the famed quote from Rodney King: "Why can't we all just get along?" Microsoft this week has started saber-rattling against Linux and other open source projects, suggesting that they infringe on 235 patents that Microsoft holds. Of course, we all know that many of these patents were dubious to begin with - UI and business-process patents that had no business being granted to anyone in the first place. It's "Fear, Uncertainty and Doubt" all over again.

Of course, the 800-pound gorilla doesn't actually have to sue anyone. Just threatening to sue, threatening to demand license fees (which, for some open source projects, would be a major problem), is enough to make people doubt the future of open source.

It's all about fear, really. Microsoft is a powerful company, with a lot of money in the bank and near-ubiquitous market penetration in some quarters. Why can't they just focus on making good software? The software will speak for itself (or it won't). It is amazing to me how much of what happens in business and within organizations revolves around fear. Fear that a company will lose market share, fear that it will stop growing as fast and the stock price will fall, fear of competition. And then, of course, helping to make other people afraid - afraid that an open source project or company will fold because someone sues them. Afraid that they might indeed have infringed on patents.

It makes me think a lot about how much we are governed by fear - even in the realm of things that seem only technological. But, of course, underneath, and around all of those bits and bytes are just human beings, after all.

Technorati Tags: intellectualproperty, opensource

Continue Reading

Carnival of Nonprofit Consultants: Better Late than Never

On 15 May, 2007 By mpm

This carnival is a day late, unfortunately. Getting construction work done on your house will make life difficult sometimes. But, finally, here it is...

A couple of the posts this week touch on things I've been thinking about. For instance, I'm reading the book Made to Stick, and this week's post by Jeff at Donor Power talks about the taglines organizations use - how they make internal sense, but don't make sense to people outside. He provides exactly the kind of advice that will help organizations connect with their donors.

I've been thinking a lot about different kinds of electronic communications, and what their advantages and disadvantages are. Solidariti has a great discussion and graph of Web 2.0 tools, their characteristics, and how best to leverage one's effort to the best effect. It's a wonderful way of looking at these tools.

Some other great tidbits:

  • Also in the Web 2.0 realm - Cause Related Marketing has an interesting discussion about a new Instant Messenger campaign.
  • Don't Tell the Donor talks about a dustup between Greenpeace and the Salvation Army.
  • Kivi, of Nonprofit Communications, tells us why organizations should pay attention to how donors want to be listed. It makes sense that it can make a difference in how people respond to appeals.
  • Nancy Schwartz, of Getting Attention, tells us how best to get people to fill out surveys.

Technorati Tags: npcarnival, nptech

Continue Reading

What do you expect from a technology provider?

On 10 May, 2007 By mpm

In talking with some organizations, I've come to realize that they don't have a handy list of things they should be asking of their network/desktop technology providers. (I think this might be applicable to all technology providers, but this is what is at the front of my brain at the moment.) Organizations without dedicated tech staff (and, I imagine, even some with) may feel at the mercy of providers, since they often don't have the technical know-how to determine whether a suggestion, a piece of advice, or a fix a provider proposes would be helpful. And if the provider speaks only tech talk, the organization's staff feel stymied in figuring out what to do. Real life example:

Outlook on a couple of Organization A's computers is very slow to load, and slow to get email (others are fine). Very small network (<6 users), using simple POP email. Technology Provider X suggests, without actually looking at Organization A's computers, that they should "move the POP mail from an external server to host on Exchange on the onsite Windows server." (These are not the actual words they used, but this is the actual content and type of language used.)

To most staff in most small nonprofit organizations without dedicated tech staff, this is completely Greek. (And, for those of you who are geeks, also completely wrong.) What is Organization A to do? How are they to figure out 1) what this means and 2) whether it's right?

So off the cuff, here are some things I think a nonprofit should ask their network providers:

1) To document the ongoing maintenance that will be done on the network and on desktops, and how often (including virus updates, defragmentation of hard drives, drive imaging, backups, etc.) - and the org should follow up, to make sure these happen.

2) To explain, when changes are suggested, what the changes are in plain English, why they think it will help, what they went through to figure that out, and what the ongoing ramifications of the change will be (for example, a change to hosting email internally on Exchange will increase maintenance costs).

3) To document system changes.

If we really care about nonprofits being able to accomplish their missions, we should care about what they know about technology, and how they approach it. We should desire to increase the internal expertise of the organizations, so they are better empowered to make good technology choices. And nonprofits should demand this of their providers.

Really, it's win-win. Nonprofits become better able to use technology to further their mission, and providers get clients that are active, engaged, and, likely, a lot less annoyed and less likely to find someone else.

Technorati Tags: nptech, techsupport

Continue Reading

Hosting the Carnival next week!

On 10 May, 2007 By mpm

I'm hosting the Carnival of Nonprofit Consultants next week. It's an open call - so just send in your best posts for the week!

Send submissions to: npc.carnival@yahoo.com. I'll accept them until midnight Sunday.

I look forward to seeing your posts!

Technorati Tags: npcarnival

Continue Reading

The problem with the word "free"

On 05 May, 2007 By mpm

Every time I start using the phrase "free software" instead of "open source" software with people who are not familiar with what either of those terms means, I invariably get questions about free (as in beer) software. "Where can I get free software to do x-and-such - we don't have a technology budget." "How can I find free software to do y-and-z?"

Yes, it is up to me to make sure people understand what "free" means (as in "kittens") - but it is these kinds of responses that send me back, invariably, to using the phrase "open source."

I do think that, on a philosophical level, the term "free software" is to be preferred. But I wonder how much education we'll have to do before people understand what the term "free software" really means, and why the word "free" is so much deeper, and so much more important, than something that doesn't cost any money.

Technorati Tags: nptech, opensource

Continue Reading

Too much \"shiny\"?

On 01 May, 2007 By mpm

Jon Stahl quotes a comment by Ethan Zuckerman about "shiny" - the over-attention to cool and groovy Web 2.0 functionality. The punch line:

... there's a good chance that underneath the shiny is something that isn't very interesting. (Not always, but often.) And that some of what's deeply, truly, long-term transformative isn't shiny at all.

Yes!

Technorati Tags: nptech, web2.0

Continue Reading

Technology Support as Teaching

On 24 Apr, 2007 By mpm With 2 Comments

I've been thinking a lot about technology support lately. Really a lot. Part of it has been prompted by my own technology support experiences with my satellite "broadband" provider (which have been largely frustrating). A lot of it has been because I have lately been exposed to situations where I felt organizations hadn't gotten the support they need, which, in our world, I think is all too common. As I move out of doing implementation, and into more evaluation, planning and facilitation of technology change within organizations, I wanted to spend some time articulating what I have tried my best to practice when I've been in a place of providing technology support.

All technology providers have to deal at some level with support. Whether they implement a system or build it, they will inevitably have the job of supporting that technology. Providers have many different ways of handling that challenge. Unfortunately, the most recent trend, which I have experienced all too much (and I'm sure you all have too), is to simply follow a script with the person who needs support. It drives me nuts that every single time I call my satellite provider about a problem with the service, I'm saying "I'm seeing 80% packet loss, and a traceroute suggests that it's about 2 hops after your modem," and they respond with "OK, first, we're going to clear out your browser cache. Go to preferences..." It has been a challenge to resist uttering strings of obscenities.

But also, the question is - is providing technology support simply an end in itself, or is it also a means to another end? That is, can it be a means to empower clients in appropriate technology use to further their mission?

I realized, in thinking about all of this, that the model of technology support that makes the most sense to me is to think of it as similar to a teacher-student relationship. I know, I'm a born educator, and I'm sure someone out there is saying "if you have a hammer, every problem looks like a nail..." But I do think there is some validity to this approach. Certainly, if you are a technology provider that values empowering your clients, this is probably a good model to consider.

So what is it about a teacher-student relationship that we can learn from to provide really good technical support? From my perspective, there are four elements to a technology support process with this as a model:

  1. Assessment - where is the client, both in terms of technology knowledge and in terms of what they need at the moment?
  2. Empowerment - as you help them with a problem, teach them about the problem, and ways to troubleshoot (or possibly solve) the problem themselves in the future.
  3. Relationship - an ongoing relationship with the client.
  4. Solution - providing the solution to their problem.

First, Assessment. Where is this client, now? First, there is the question of what they know. If you have a relationship with them (see #3), you'll already be familiar with their technical expertise - so you'll know where to start. But there is more to assessment than that. What is going on for them? Is this a problem that is critical to their work, or a "pebble in the shoe" kind of problem - annoying, but not urgent? Are they trying to get a grant out, and scared they won't meet the deadline because of a technical issue? Are they angry? All of these are important to know and understand, so it's possible to meet them where they are. That's one of the hallmarks of a good educator - meeting a student where they are, tailoring the education to meet the needs of the student. It's also, I think, a hallmark of a good provider of technology support.

Second, Empowerment. One of the most common problems for someone who builds websites is the client who calls up and says "the website is down." And you hurriedly go to your browser, and, voila, the website isn't down. So now you take them through all of the steps to figure out why they can't see their own website. You can choose to take them through this problem so that they figure out at the moment what's up, and who to call, or you can take them through it so that next time it happens, they won't need to call you, because they've figured out the problem really belongs to "insert_some_other_technology_provider_here." Or, they'll call you because the website really is down. Teaching them about the technology behind the problem they are having, and helping them understand what's involved in it, not only empowers them to deal with problems more on their own, but also empowers them to solve other technology problems, and to be more engaged in technology planning in the future.
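That troubleshooting ladder can even be sketched in code. This is a hypothetical helper of my own invention (the function name and messages are illustrative, not from any real support toolkit) that walks the same steps you'd walk a client through: does the name resolve, and can we reach the server at all?

```python
import socket

def diagnose(host, port=80, timeout=3):
    """Return a rough diagnosis of why a website might seem 'down'."""
    # Step 1: does DNS even resolve the name?
    try:
        addr = socket.gethostbyname(host)
    except socket.gaierror:
        return "DNS lookup failed - check the domain name or your DNS settings"
    # Step 2: can we open a TCP connection to the web server?
    try:
        with socket.create_connection((addr, port), timeout=timeout):
            return "server reachable - the problem may be the web application or the browser"
    except OSError:
        return "server unreachable - the problem is likely the host or the network path"
```

A client who understands these two questions already knows whether to call their registrar, their host, or their web developer.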

Third, Relationship. All of this works within whatever relationship you have with a client. As mentioned above, if you've worked consistently with a client, you know what their level of expertise is - this makes assessment easier. Also, remember how much work you got done when a substitute teacher came to class? Not a lot of learning, but certainly a lot of spitballs. Consistency in relationship is as important to students as it is to people who get support from a technology provider. Usually, of course, with the huge technology providers, that sort of thing isn't possible. But with smaller providers it certainly is. Sometimes, even larger providers manage to get around this by keeping detailed logs of conversations with you. I've found that quite helpful in the past - it has surprised me when someone has said something like "I see you called a couple of months ago with a problem regarding x. How has that worked for you since then?" It was nice to feel that someone actually bothered to write it down, and that the person talking with me bothered to read it. In the past, my ongoing support relationships with clients have been the way I have learned the most about their organizations. It has allowed me to be proactive in working with them on technology, and has been incredibly informative in helping future planning. The relationship is a two-way street: just as they let us know about the challenges they face with their technology - it's important for us to tell them about the challenges we run into in working to support them. There is a level of trust that's important to this relationship. Honestly, it is the relationship I cherish most highly (even more highly than whatever they pay me).

Fourth, Solution. This is where the provider-client relationship differs most from the teacher-student relationship. Of course, in the end, the client needs their technology problem solved, as quickly and efficiently as possible. But I'd argue that good assessment of where the client is, and where the problem fits in their work and organizational lives, empowerment of them to troubleshoot problems on their own, and an ongoing, stable relationship will make the eventual solution of the problem a lot easier, more economical, and less stressful for both the client and the provider than it might be otherwise.

Technorati Tags: nptech, techsupport

Continue Reading

Free as in \"Free Kittens\"

On 23 Apr, 2007 By mpm With 2 Comments

Deborah Finn pointed out this good post in a blog I have never read: ALA Tech Source. I haven't read that blog (yet) because I'm not a librarian, although I've always agreed with what Deborah said on the ISF list: "I have long thought that nonprofit techies should make a point of learning from and making common cause with tech-savvy librarians." She's ahead of me, since she actually reads librarian blogs.

Anyway, there is a (soon to be classic) line: "...all of these technologies are 'free' as in 'free kittens,' not free as in 'free beer.'"

I do think that is something we have a hard time getting across to folks. I just had a great conversation with an IT manager at a medium-sized nonprofit that had implemented Asterisk for their call center. The bottom line, from his perspective, was flexibility. They saved some cost over a proprietary PBX system - but then they had to spend more on support and the like - it was a wash, cost-wise. But what they gained, and it sounds like he's not willing to give it up, is flexibility. It takes more work, because you own your own system. But then, you own it - you can do much more.

Open source software, like a kitten, takes care and management. Some software, like Firefox, is like the kitten that is easy - it gets litter trained once, then just curls up on your lap (or in its little bed) and sleeps, and plays only when you want it to. Other projects take more care and feeding, and you might have to take them to the vet.

Technorati Tags: nptech, opensource

Continue Reading

Open Standards part 2: XDI and Data Integration

On 23 Apr, 2007 By mpm

Back in December, I had planned to talk first about document format standards before I plunged into XDI. But a couple of things intervened. First, I decided to write a full-blown whitepaper on document standards, so it will be a bit before it comes out. I think people (especially in the nonprofit sector) take document formats far too much for granted, and I think they deserve more treatment than just a blog entry.

I also had a chat with Andy Dale, of ooTao, and it provided lots of great fodder for an informational blog entry. So, here it is. I won't go into nearly as much detail as he went into with me - at some point I'll write something much more substantial. But this is a good start.

What is XDI, anyway? XDI stands for XRI Data Interchange. It's all about standards for sharing data over the net via XML and XRIs (eXtensible Resource Identifiers - URIs on steroids).

Look at the basic problem: how does data source "A" talk with data source "B"? We've done a lot of that via APIs - but that's a set of idiosyncratic solutions to individual problems (solving the Convio <-> GoogleMaps problem is different than solving the Joomla <-> Salesforce problem, for instance - lots easier than it used to be, but still atomized). How can this be standardized?

It's important to understand that this problem has many layers. The first is the identifier layer. Who are you, anyway? Then authentication - how do I know that you are who you say you are? Then there is authorization/trust - what are you allowed to do, what data can you see? And, finally, there comes the data sharing layer. That's where this is all leading, of course, but what if, when you finally get down to that layer, I say "tomayto" and you say "tomaato"?

Each of the technologies implemented at these layers has to be optimized for different things - you wouldn't want your data sharing layer to have strong crypto and be optimized for figuring out who you are, would you? That would be inefficient. So these layers are separate, and, in most situations, pluggable. For instance, you could plug OpenID into the authentication layer for internet transactions, and use Kerberos for internal organizational purposes.

So the bottom layer of XDI is optimized for figuring out how the data should be shared. For example, think of a lexicon for all the ways that "First Name" exists out there ("given name", "First", "nombre", etc.) - so it would be possible to share that data. Also, one idea that is part of XDI is that some data is persistent, and some data is simply a link to persistent data - so the data doesn't hold my address, for instance, but it does hold exactly where (the XRI) to get my current address.
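To make the lexicon idea concrete, here is a toy sketch. This is not XDI itself - the dictionary, the canonical names, and the function are illustrative inventions - but it shows how a shared lexicon lets records from different sources be merged:

```python
# A toy lexicon mapping the many names for a field to one canonical name.
LEXICON = {
    "first name": "given_name",
    "first": "given_name",
    "given name": "given_name",
    "nombre": "given_name",
    "last name": "family_name",
    "surname": "family_name",
}

def normalize_record(record):
    """Rename a record's fields to canonical names so two sources can agree."""
    return {LEXICON.get(key.strip().lower(), key): value
            for key, value in record.items()}
```

With every source normalized this way, `{"First Name": "Ada"}` from one database and `{"nombre": "Ada"}` from another both become `{"given_name": "Ada"}`.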

Andy and I talked a bit about his work in the nonprofit sector. He sees the sector as a great place to try these ideas out, because, for the most part, there is a much more open and flexible ethos around data sharing. I think that is probably mostly true, but as I pointed out to him, the sector is often years behind the for-profit sector in terms of technology. There is a pilot project with Kintera to expose a subset of one nonprofit's data via an XDI interface. There are others lined up to try it, and the hope is it will spread. I certainly hope it does, and I will be keeping track of this effort, for sure.

I like the idea of this kind of standard - it moves data sharing beyond what we (barely) have now, which is a very atomized set of solutions (even though they are solutions we badly need). If every data-centric application (ooh, that's redundant) that a nonprofit implemented had a standard interface for data sharing - think about the possibilities there. Right now, it's still basically impossible to look at big pictures across a wide range of data domains. This kind of standard would make those kinds of analyses a lot easier.

So this is the next jump beyond open APIs: imagine SQL-like queries on any data, anywhere you were trusted, and across those sources. And I thought open APIs were the holy grail!
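As a rough analogy for what that could feel like (this is plain SQLite, not XDI, and the tables and data are made up), SQLite's ATTACH lets a single SQL query span two otherwise separate databases:

```python
import sqlite3

conn = sqlite3.connect(":memory:")                  # source A: fundraising data
conn.execute("ATTACH DATABASE ':memory:' AS crm")   # source B: a second, separate store

conn.execute("CREATE TABLE donors (name TEXT, total REAL)")
conn.execute("CREATE TABLE crm.contacts (name TEXT, city TEXT)")
conn.executemany("INSERT INTO donors VALUES (?, ?)",
                 [("Ada", 250.0), ("Grace", 100.0)])
conn.executemany("INSERT INTO crm.contacts VALUES (?, ?)",
                 [("Ada", "Boston"), ("Grace", "Springfield")])

# One query spanning both sources, as if they were a single database.
rows = conn.execute(
    "SELECT d.name, d.total, c.city "
    "FROM donors d JOIN crm.contacts c ON d.name = c.name "
    "ORDER BY d.total DESC").fetchall()
```

Imagine that, but across organizational boundaries, with trust and authorization handled by the layers above - that's the promise.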

Technorati Tags: nptech, openAPI, openstandards, xdi

Continue Reading

This guy is right on

On 18 Apr, 2007 By mpm With 5 Comments

A blog reader introduced me to a new blog by a guy named Phil Jones. Among other great things, he has this amazing post about Microsoft and their future. Basically, he argues that in the era of Web 2.0, the only really compelling platform they have is Excel. Read this post; it's dead on.

I've always loved Excel (and, since I don't own a copy, I hobble along with OpenOffice's pale, pale substitute). I've thought that it was truly one of the best pieces of software ever written. Really. And it's amazing how much it can do, and how much an organization can do with it. There are plenty of very small organizations (and not so small) that run on Excel. Many shouldn't, but some, arguably, certainly can. And if the ideas he suggests for bringing Excel fully into the new age were actually implemented by Microsoft (fat chance), that would make it even better.

Technorati Tags: microsoft, web2.0

Continue Reading

Netsquared Innovation Fund

On 16 Apr, 2007 By mpm With 3 Comments

David says of the Netsquared Innovation Fund process:

Advocacy is appropriate and good. Mobilizing your network to help you win by making your network part of the process is also appropriate and good. Mobilizing your network to game a voting process suggests a weak understanding of how communities and social networks create real change (as opposed to raising a buck).

I thought about blogging about the projects I voted for, with details on why I voted for each. But then the flood of "vote for this" and "vote for that" started in the blogosphere and by email, and I decided that I didn't really want to enter into that space.

There are a very large number of very good and totally deserving projects on that list. I was hesitant, frankly, about the whole idea of a semi-public balloting process. I like the idea of having the widest range of people vote on projects, but I found myself tempted by the "oh, she works on that, I should vote for it" voice, which I tried to temper as much as possible, focusing instead on whether or not a project fit the criteria. I think I mostly failed at objectivity.

Technorati Tags: netsquared

Continue Reading

Drupal, Joomla and Plone! Oh my!

On 15 Apr, 2007 By mpm With 4 Comments

At NTC, there was a lot of talk about the "big three" open source CMS packages that most people in our sector are using these days: Drupal, Plone, and Joomla. I've had a fair bit of experience with Drupal - nosi.net runs on Drupal, and I'd done a Drupal install once, and helped with some now and again. I hadn't had experience with either Plone or Joomla, but in talking to folks at both NTC and Penguin Day about Joomla, I got intrigued.

I have a new endeavor (see the last post) that needs a new website, and I figured, why not? I hear Joomla is dead easy to install, and I need dead easy right now, so let's try it. Well, guess what? Installing Joomla is dead easy. I could do it with my eyes closed. I set up a MySQL database in my standard generic virtual hosting setup, copied the downloaded and unzipped Joomla folder into my webspace by FTP, and fired up my browser. Four or five clicks later, ta-da! A website.

Um, sorta. I guess that's where it gets interesting when you work with a CMS, right? What are all those content types, and where do they appear, and how do you get things to look exactly like you want them? It's the same, really, with Drupal, only different. CMSs do share that pretty serious learning curve - but I'm getting over it, slowly.

So I like Joomla. Do I like it better than Drupal? I'm not yet sure. It definitely focuses a lot on the eye candy, which is nice, actually - I like that the admin interface is pretty. I know, that's silly, but it's true. In some respects, it's easier to use, although in others, Drupal can be a bit easier. It's a tossup, so far. They seem to have very similar feature sets. We'll see how I feel as I progress with it, and see how far I can go. I hear that all of the "cool chix" use Drupal, though (Linuxchix is about to launch its new Drupal-based website). There is, I think, a bit of a geeky bias toward Drupal. So, maybe since I'm becoming a bit less of a geek, Joomla's a good pick? But Joomla is pretty darned geeky. Like, what is a mambot, anyway?

Technorati Tags: cms, drupal, joomla

Continue Reading

Speaking too soon

On 15 Apr, 2007 By mpm With 4 Comments

I've been doing a lot of thinking since I wrote my post, a few weeks ago, saying I was done with technology consulting. In one sense, I spoke too soon, although in another, I was right on. And, to some extent, this post is a bit self-indulgent, so if you're looking for some concrete technology talk, you might want to wait for the next post, on Joomla. :-)

I first started doing technology consulting for nonprofit organizations in 1996, with a project for a local public television station (WGBY in Springfield, MA), to design a technology center for teachers to learn about technology and the internet, so they could apply that in their classrooms. It was a great project, and a success, since that technology center is still in operation today. Understandably, it has come to be somewhat different than I designed it back then, but it still feels good that something that I worked hard on is still serving people. And it was the sheer enjoyment of that project - of talking to many different people about needs and desires, thinking about how to appropriately use technology to those ends, that got me out of academia, and into the nonprofit and educational technology world.

I did a lot of planning, evaluation and training in the beginning - some on my own, some with Summit Collaborative. It was what I enjoyed most, and it was what I thought I was best at. But somewhere along the line, I started to do more and more implementation, because, honestly, that was what my clients needed most at the time. I put in a few networks in the late '90s (ugh, really, I pulled cable). I started to do databases for organizations, and then, in 1999, I flew headlong into web application development, which became my specialty and mainstay until I took a break to go to seminary in 2005. At first, I liked it a lot. I liked being able to create things that I thought my clients wanted (and they thought they wanted). I stumbled a fair bit along the way. I had a hard time being a successful business owner with employees (I pretty much suck at it, so I hitched my wagon to Database Designs Associates from 2003 until this year). And I struggled mightily with my own capacity to build really good applications, mostly without other developers to help out. It was really hard to try to write new applications building on a framework I'd written a while ago, while simultaneously improving that framework and keeping up with new things such as Ajax and RSS, mostly by myself. It just wasn't happening very well.

And as time wore on, I lost touch with people and organizations. I sat for hours (or days) at a time in front of my screen without contact with the folks I was doing the work for. And if there was contact, it was most often on the level of "can you fix this?" or "can you add this feature?" I don't blame them - they needed the fixes and the features. But that was a pale shadow of the kind of work and contact I wanted with my clients. And I also struggled with the consulting business model. In the early days, as a business owner, I needed to think a lot about sustaining the business (I had employees, and I wanted them to eat). And later, even though it wasn't a large part of my job description, it was still something I had a hard time with - like getting yanked out of my flow to answer RFPs.

For one long-time client (I had this client for just about the whole span of my consulting career - they were my second client), I had a much fuller, richer role. Even though much of the work I did for them was database and web application development, we'd built a great rapport over time, and it felt wonderful when I got the chance to talk with them about bigger-picture issues. But that was not so often, and, as staff in that organization left over time, that relationship changed.

When I came back from seminary, I was very clear that I couldn't do technology consulting in the way that I had come to do it. I couldn't bring myself to code, or design databases, or write connections to APIs, or do any of those things that had become my bread and butter over the past six years. I wanted to work directly with organizations and people. So it seemed to me that I needed simply to leave technology consulting behind, and move into doing things in a more spiritual vein, perhaps.

But then, I had something of an epiphany. And that epiphany was in my post about "Technology Consulting 2.0." The more I thought about it, the more it made sense to me, and the more I liked it. And the more I talked with other people about it, the more it made sense for me to do it. I will hold off for a while yet on working with people directly on spiritual issues, and work now with what could certainly be called the spirituality of nonprofit technology - finding balance and looking at the bigger picture. I'm creating a new practice, called MetaCentric Technology Advising. It will include visioning and planning, evaluation and training - all of the stuff that I liked most about nonprofit technology, and, honestly, what I'm probably best at. And it's nice to know that the last 8 years or so as a "technology vendor", as it were, will be there as good experience and guidance as I work with clients.

I won't talk much about it in this blog again, but I thought it might be something people would want to hear about.

Technorati Tags: consulting, nptech

Continue Reading

The Wealth of Networks, Chapter 5

On 15 Apr, 2007 By mpm

I'd taken a long break from Yochai Benkler's The Wealth of Networks - I had a lot going on, and, well, it's a really, really meaty read. But I picked it up again, and was in the middle of it around the same time as the discussions around the Journal of Information Technology in Social Change happened. And as I finished reading the chapter, it became clear to me that the chapter might well be Yochai's two cents on our conversation (not that I've asked him, but some things seem kinda clear from this chapter).

The chapter is titled Individual Freedom: Autonomy, Information and Law. Basically, it talks about the ways that individuals live, the kinds of things that increase autonomy, and the things that decrease it. He starts out laying the framework: the networked information economy puts materials in people's hands for action, it provides non-proprietary sources of communications, and it decreases the extent to which people can be manipulated by those they depend on for communication. He then goes into detail on each of these ways that the networked information economy increases autonomy.

There's a lot in this chapter, and I can't possibly do it justice - go read it. But what I want to highlight is his section on autonomy, property and commons. First, because it bears most closely on issues of open content in the nonprofit sector. Second, because it's a set of concepts that are pretty new to me, and I found them interesting and the arguments compelling.

First, both markets/property and commons have something in common - the ability of people to have some amount of certainty that a set of resources is available to them so they can, as Benkler says, "execute plans over time." I'd just say, live our lives - or, in the case of nonprofits, accomplish their missions. But markets and commons create these certainties in different ways, as you can imagine. Markets are dependent on the willingness and ability of people to pay for goods and services, and are constrained in certain ways. Commons are also constrained in certain ways. He says:

Whether having a particular type of resource subject to a commons, rather than a property-based market enhances freedom of action and security, or harms them, is a context-specific question.

Basically, we have to take things on a case-by-case basis. There may be times (I'd say home ownership is a good one) when a property-based market would enhance security and flexibility, and a commons-based resource might not. And there will be examples (see below) where the opposite is true. It is his opinion, and based on his arguments I agree, that a mixture of proprietary (market-based) and commons-based resources provides people with the most flexible set of resources, leading to the greatest autonomy:

Given the diversity of resources and contexts, and the impossibility of a purely "anything goes" absence of rules for either system, some mix of the two different institutional frameworks is likely to provide the greatest diversity of freedom to act in a material context.

He goes on to say:

As to information, then, we can say with a high degree of confidence that a more expansive commons improves individual autonomy, while enclosure of the public domain undermines it. This is less determinate with communications systems. Because computers and network connections are rival goods, there is less certainty that a commons will deliver the required resources. Under present conditions, a mixture of commons-based and proprietary communications systems is likely to improve autonomy.

He thinks that if conditions change, including increasing peer-to-peer networks, and wireless mesh networks, a commons-based communications policy would increase autonomy.

Later in the chapter, when he talks about mass communications, he uses a great metaphor of storytellers. I won't detail it here, because this is already getting pretty long. But it's worth reading - it has to do with how free we are to tell our own stories, and to hear the stories of as wide a range of people as possible.

I think his contribution to our discussion about open content in the nonprofit sector would be this: since content is information (a nonrival good), and since information is both an output (I write a whitepaper that people read) and an input (someone takes the information from that whitepaper and updates it, or uses a piece of it in another paper with a different focus), a commons-based approach is the one that will provide the greatest security and flexibility. In other words, the approach that will allow nonprofits to best fulfill their missions, or, in Benkler-ese, "execute their plans."

Technorati Tags: intellectualproperty, nptech, opencontent

Continue Reading

Open Source Feminism?

On 12 Apr, 2007 By mpm With 1 Comments

Beth Kanter, as always, has a great, informative summary of the Penguin Day activities last Saturday in DC. She's got some great video blogging, including a short one on "open source feminism." Although women were only 25% of the Penguin Day attendees, that's actually pretty darned good for open source related events.

We'd love to get more women involved in nonprofit open source - women from the nptech world who might not be thinking a lot about open source, and women from the open source community who might not be thinking a lot about nonprofit organizations. Let's get together!

Technorati Tags: nptech, opensource, penguindayDC, women, feminism

Continue Reading

What's coming up ...

On 10 Apr, 2007 By mpm With 1 Comments

I've been reviewing my blogging plans, and I have realized that I have been quite remiss in continuing the varied overlapping series that I started over the past few months. So, over the next couple of weeks, I'll be digging back into some interesting territory. I'll be blogging a new chapter of Yochai Benkler's The Wealth of Networks (which, by the way, is about information and personal autonomy - it dovetails perfectly with the conversation about open content in the nonprofit sector.) I'll be talking more about open standards, including the open document standards war, and XDI and identity. And I'll keep talking about my thoughts on technology consulting, and open content. Also, Deborah Finn gave me the blogging assignment to apply just war theory to my approach to technology. It's an interesting assignment, one I'm gamely choosing to accept. I'm really looking forward to the next batch of blogging coming up, and I hope it turns out to be useful and engaging.

Continue Reading

Dialogue about JITSC, part 2: Open content models

On 09 Apr, 2007 By mpm With 3 Comments

This conversation is very interesting, and very useful.

Both Michael and Laura bring up some important points that I want to talk more about - the cost of providing good content, and ways to provide that good content in a way that is sustainable. There is no question that providing good content costs money - I have no illusions about that. And I know, very personally, that raising money in a traditional sense (from foundations, etc.) for producing content is difficult, and takes a lot of time. And I don't think that we have all of the answers yet to solve this problem - but it's a problem worth solving, a problem worth struggling with, rather than just going down the path of least resistance.

I got on Michael's case about this primarily because his journal is about technology and social change - and, as he had said, he's made passionate arguments about open content in the past. But ultimately, yes, I do think that all content that we provide to the nonprofit sector should be freely available, and under Creative Commons (or similar) licensing. That's the only way to provide important information to nonprofits that need it - some have a hard time affording even nominal fees for that sort of thing.

There are all sorts of interesting models for providing content in this way while still remaining sustainable. Providing the online version free and open, and charging for a print version (obviously, above and beyond just the cost of printing it), is one idea. The open source community has all sorts of good models to learn from: ways to leverage open content to get folks to pay for premium services - in this realm, that could be training, or webinars, or those sorts of things. I think revenue sharing is possible - asking nonprofits that have resources to contribute so that the content can be freely available to all, for instance. Michael's open bounty is a great idea, and I'd love to help in any way I could to make that happen. There are collaborative content generation models - spreading the work out among more people. I had also heard of the publishing model that Peter brought up - allowing the authors to provide open access.

Believe me, between working with NOSI to provide good content, and thinking about what I am going to do with that science fiction novel I wrote last summer that I'd like to publish at some point (I realized that once I started this conversation, I forever closed off the option to publish it traditionally), I feel this issue very keenly, and very, very personally.

I do want to address Laura's concern about expectations. She says:

But I'll put an unpopular suggestion out there: I think we as a community also need to consider possible negative impacts of advocating that all content ought to be open. It's already very difficult to pay for the effort of creating great content; if in addition we promote in people's mind the idea that all content ought to be free, it's hard to escape promoting the idea that no content is worth paying for. Which puts us in danger of tipping an environment in which it's very difficult to support good content into one in which it's downright impossible.

It's an interesting comment, but I think it doesn't take into consideration the way that gift economies work. A system where all content was freely available and under a Creative Commons license would be a gift economy - in the same way that open source software and Wikipedia work within gift economies. And there are great examples of sustainable gift economies out there, and of ways that the "real" economy feeds gift economies. I think it's always important to make clear in people's minds the difference between free "as in beer" and free as in "information wants to be free." There is an educational component to providing free and open content. And I think we have to think about the negative impacts of providing content only to those who can pay for it - increasing an already evident digital divide between nonprofits that have the resources to pay for these kinds of content, and those that do not.

Technorati Tags: nptech, opencontent, publishing

Continue Reading

NTC Summary, and Nonprofit Technology Consulting 2.0

On 08 Apr, 2007 By mpm With 5 Comments

As I write this, I'm hurtling through small towns and big cities on the train home. We've passed through Baltimore - which reminds me of a project I did once, way back when, to work with a group of mostly small and medium-sized organizations on technology planning. In those days, the buzzwords were "internet connectivity," "networks," "websites," and "email." This was in the solidly web 1.0 world, where many organizations still weren't even networked, still used dial-up internet connections, and had websites written in the earliest version of FrontPage, or done by the CFO's nephew.

I've emerged from this week's frenzy of buzzwords like "blogging," "open API," "e-advocacy," "municipal wireless" and "social networking" surprised not at how much things have changed, but at how much they have stayed exactly the same. From the stories I've heard this week, nonprofits of the size that I'm most familiar with (small to medium-sized) still don't have the in-house technology expertise to make evaluations about what directions to go in. They sometimes deal with vendors and developers that don't really understand their mission, don't speak their language, and don't tell them the truth (whether intentionally, or through a lack of self-examination). They struggle mightily with software, no matter whether it's free/open source or proprietary, shrink-wrapped or custom-built, on their desktops or web-hosted, and they generally spend extraordinary amounts of time and/or money on it. The buzzwords have changed and the technology has gotten more sophisticated - but the problems many nonprofits are facing are exactly the same. So I hate to throw cold water on the whole enterprise - but if the core issues that most nonprofits are facing haven't changed, and the situation isn't getting better, how, exactly, have we helped?

I also saw the conference with some different, post-seminary eyes. I was looking for the deeper purposes behind the implementation of technology. I was looking for the discriminating approach to adopting technology appropriately. I was looking for the big conversation - why are we doing this anyway? Is it still just in the pursuit of "efficiency"? Is it all just TCO arguments? And I also looked at this with post-implementation eyes. I spent 8 years implementing technology "solutions" for nonprofit organizations. I wrote thousands of lines of code and designed more databases than I can count. I think I truly did some good, and I know I made mistakes along the way. Mistakes I hope to learn from, now that I won't be doing implementation anymore.

Sometimes, the forward march of technology seems like this train I'm riding on - inexorably traveling down the track of capitalist profit, while nonprofits hang on to those little hand-powered trucks that we, the people who serve them in this realm, are working really hard to pump up and down so we can gamely try to keep up, and while they watch really large organizations zip by in bigger, better vehicles, looking exactly like they know where they are going. But no one seems to be asking "why are we on this track in the first place?" "Is being on this track really going to help me save the whales/feed people/organize/save the planet?"

And it's making me think a lot about what I'm going to start calling "Nonprofit Technology Consulting 2.0" (and yes, I'm subverting the dominant paradigm.) I don't know yet whether I'll actually start practicing it, but I'd like to think about it more. What would it be like if we could help nonprofits with the following:

  • Asking whether technology implementations in their organization in the past have really facilitated their mission? In what ways have they not?
  • Asking whether technology played a beneficial, damaging or neutral role in internal organizational dynamics and staff morale?
  • Asking, before implementing a new technology - what problem are we really attempting to solve? Is it a problem that can be solved in other ways?
  • How does increasing use of networking technology, on-line presence, and internet communications facilitate or hinder work that is done face to face?
  • Making choices about technology not just based on cost/TCO or feature set - but to bring in issues of the effects on staff, organizational dynamics, and the role of factors such as organizational determination of data destiny, source and ownership of software, and environmental impact.
  • Being mediators between vendors and nonprofits - to look at issues that are technological, and issues that are about personality, behavior and organizational structure and dynamics (on both sides)
  • Looking at the bigger picture - how does what an organization does with technology affect the larger community, and the planet?

I'm looking for ways that it might be possible to practice nonprofit technology consulting with head and heart, with a view to the bigger picture of our society and our planet, and the precarious place we are in as human beings at this time, and with a view that reflects my emerging belief that increasing human touch and human contact will do more, in the end, than many of our attempts to increase efficiency by using technology.

When I re-started this blog 6 months ago, I named it Zen and the Art of Nonprofit Technology for a good reason. I want us to pay attention. I want us to pay attention to what we are doing, and how we are doing it. I'm very clear that there are technology implementations that are completely appropriate, mission-facilitating, and even good for the greater community, and good for the planet. I want to make sure that every single technology implementation is like that. My bet is that we might do a lot fewer of them if that were so.

As I keep thinking more about this, I'll be blogging about it. I welcome any feedback and conversation, either by email, or in comments and trackbacks on this blog.

Continue Reading

How do we make change if we keep doing things the same way?

On 07 Apr, 2007 By mpm With 6 Comments

I had heard about this new journal a while ago, and it was sitting in some small corner of my brain, waiting for me to pay attention. I ran into an old colleague at NTC, and it came up, because he had been thinking of contributing to the journal, but decided that he probably would not, for reasons I will talk about.

The new journal, the Journal of Information Technology in Social Change, is, I think, a needed part of our landscape of resources for the sector. And the editors, both of whom I respect highly, have impeccable credentials to pull this sort of thing off, and make it successful.

But then I looked deeper. The journal is, basically, business as usual. It's peer-reviewed (good), but it's got a rather restrictive license, and the content is not freely available. The licenses are as follows:

Personal License:

If you have purchased a copy/subscription to the Journal with a personal license, this means that it is for your personal use. You may make copies for backup purposes or to allow you to personally use this report on more than one computer. You may also print copies, but not for circulation of any kind [emphasis mine].

Corporate License:

For most of you, we recommend a corporate license. If you have purchased a copy/subscription to the Journal with a corporate license, this means that it is for use by people within your organization. You may make paper copies for internal circulation. You may post it to your intranet, so long as access to that intranet is restricted to those who work for your organization [emphasis mine].

In other words, don't make copies for a workshop, or for a colleague who isn't inside your organization, and definitely don't make a copy for your mother to read.

But it's a journal about technology and social change! This goes back to my constant refrain - the means are the ends. How can we talk about technology in social change, while, at the same time, publishing in a format that limits the availability of this knowledge to people privileged enough to pay for it? How can we talk about promoting change when we're not pushing this content into the commons?

The Public Library of Science is a wonderful example of a reputable, respected peer-reviewed journal where articles are freely available to the public. They say:

Published research results and ideas are the foundation for future progress in science and medicine. Open Access publishing therefore leads to wider dissemination of information and increased efficiency in science ...

Which is, actually, a very practical, down-to-earth argument. Benkler goes further, and I go with him:

Information, knowledge, and information-rich goods and tools play a significant role in economic opportunity and human development. While the networked information economy cannot solve global hunger and disease, its emergence does open reasonably well-defined new avenues for addressing and constructing some of the basic requirements of justice and human development ... More importantly, the availability of free information resources makes participating in the economy less dependent on surmounting access barriers to financing and social-transactional networks that made working out of poverty difficult in industrial economies. These resources and tools thus improve equality of opportunity. [emphasis mine]

I think it is incumbent upon knowledgeable leaders to provide models for how to do things differently - provide tools that foster social change in ways that foster social change, not in ways that help to sustain the status quo.

I invited Michael Gilbert to a dialogue about this, which he readily agreed to. Below is his response. We\'ll be continuing this on each of our blogs, with cross-linking. Please feel free to join the dialogue, either in comments, or on your own blog. I\'ll respond to Michael\'s response in another post.

Thank you so much for wanting to start a dialogue on this issue.

I would like to respond in three parts. First, I want to say a few words about my enthusiastic support for the critique of closed licensing offered by Michelle by reflecting a bit on my past actions in this regard. Second, I want to lay out as clearly as possible the circumstances that led to a decision to use a traditional closed license. Third, I want to invite people to participate in a conversation about how this could be done differently.

As anyone who has followed my advocacy work over the last ten years will know, I am a fervent supporter of open licensing models as a profound public good. I started promoting the Public Library of Science to the readers of Nonprofit Online News as far back as December of 2002. I've praised the innovation of the Creative Commons licenses on more than one occasion, along with Lawrence Lessig's other work and ideas. (I have in fact offered a great deal of content under Creative Commons licenses in the past and will no doubt do so again.) I have been a champion of openness of all sorts, including such things as open licenses and the destructiveness of DRM, in panel after panel in the nonprofit tech community for a decade. I have more than once written challenges of others similar to Michelle's challenge of me, and I must say that I can only hope that I've been half as courteous as she has been.

Before I explain the circumstances that led to our licensing decision, I want to make one thing very clear. Although the Journal was prepared in partnership with NTEN, I take full and personal responsibility for the decision to use a closed license. Katrin Verclas (the Executive Director of NTEN, for those who don't know) was eager to know if there was any way to make it open and pushed hard for it. I am the one who, with the interests of the sustainability of my own small organization in mind, refused.

The question of licensing is a terrible dilemma for authors, readers, reviewers and publishers right now, and I happen to be all of the above. I'm in an absurd position, personally. I want our efforts to reach the broadest possible audience, and yet, on a gut level, I loathe the restrictive nature of the journal industry. At the same time, I have a small organization with an established base of customers that will pay for high quality information. (In other words, I have paying subscribers who have been waiting for this journal for months.) Most importantly, I have staff to pay. Thus, the journal has a fee, although we've done our best to make the personal rate much lower than the organizational one, and in no case are we anywhere near some of the stratospheric prices of many mainstream journals.

I've watched open journals fumble along, and when they publish at all it's the result of great sacrifice on the part of the people publishing them. Some, which have a home in the extra time that some academics can spend on such things in their jobs, are almost sustainable. Others aren't at all. I'm really not sure what the answer is. The overhead of finding sponsors for a small publication is enormous. We experimented with it briefly two years ago when we first decided to publish a journal, but we couldn't make it happen. Is there a business model that will make this work? I'm really not sure.

Quite frankly, nothing would please me more than to find a way to finance the expense of the journal without fees for licensed copies. The licensing is a pain for everyone. It's friction in the system, designed only to create some financial accountability for the work involved in nurturing the relationships involved and husbanding the papers into the best form we can manage. Maybe the answer is to abandon that and just use the Internet for direct publishing by authors, but I don't think we're far enough along yet in developing network-centric models that do what competitive selection, peer review, and editing will do. Maybe the answer is for a single donor to step forward and fund the next half dozen issues. Maybe the answer is some kind of quarterly bounty where, as soon as financial pledges reach a certain amount, the publication goes to an open license (or maybe that's when the next issue is commenced). I really don't know. If you want to help figure it out, I would be very grateful.

To wrap up, I just want to say thank you to Michelle for jumping on this right away. (I only wish you had been at the panel for the Journal on Friday, where we talked about our larger goals. The licensing issue would have been a good piece of that discussion.) The sector benefits from this sort of criticism, and we'll all be better off for it.

Technorati Tags: 07NTC, intellectualproperty, nptech

Continue Reading

On 06 Apr, 2007 By mpm With 1 Comments

There has been a little bit of blogging and the like at NTC - although it was certainly hindered yesterday, when the internet was down for most of the day. And I think most of the bloggers are too busy giving, or going to, sessions to blog much. But there are a few tidbits that I've enjoyed:

  • There is a great Flickr stream building up of photos tagged with "07NTC", including one of me.
  • Michael Gilbert has a review of things happening on the varied NTC backchannels, with some wry commentary.
  • Michael also has a really interesting map of the nonprofit technology space.
  • Yet Another Anonymous Nonprofit IT Staffer (hmmm, does it say something that with some regularity we discover blogs by folks who feel the need to be anonymous?) has great commentary about getting staff buy-in to technology projects: it's the mission, stupid!
  • Charlie Brown, of Ashoka's Changemakers, says "It's not about technology - its about appropriate technology... Its about human behavior... What do people actually need?" Yay, there are people who get it. I'm sorry I missed that session!

After it's all said and done, I'll post my overall review. But the next step is Penguin Day!

Technorati Tags: 07NTC, nptech

Continue Reading

Technology Consulting 2.0

On 04 Apr, 2007 By mpm With 1 Comments

I had a great Day of Service with the Advocacy Project, a great organization that sends interns out into the field to work with local partner organizations on issues such as human rights, women's health, and peace, among many others. We talked about appropriate use of Web 2.0 tools for their interns, and for themselves - for advocacy, fundraising, and information dissemination.

It was fun and engaging. They are an interesting and eclectic group, and our conversation ranged all over the map. But it felt useful, and I learned a lot from them. It made me think about what is important to me about consulting - why I got into doing consulting in the first place. I like talking with people. I like learning from them, I like working to give them concrete information they can use, as well as thought-provoking questions for them to ponder as time goes on.

And it reminded me of what I had been missing for all of this time in working to implement technology. It was the human contact, the human touch, the connection about more than just "can you fix this bug?" or "can you build this?" That's what I've been missing.

Technorati Tags: 07NTC, consulting, web2.0

Continue Reading

Going to DC ...

On 02 Apr, 2007 By mpm

Well, my bags are just about packed, and I've prepared just about everything I can prepare. I'm involved in two sessions at NTC: the Linux Geekout on Thursday at 3:30, and the Case Studies in Open Source Software at 10:30 on Friday morning. I'm also facilitating two breakouts at Penguin Day DC, one on Desktop Linux, and a second on starting open source projects.

It's going to be an interesting NTC for me. This will be my first in three years (I missed the last two.) I'm going entirely with my NOSI hat on, and with a different perspective, since I'm not doing technology consulting. I'll be thinking about open source, and about technology writing. I'll get to see some old friends, and meet some new ones. Email me if you want to make sure we catch up.

Technorati Tags: 07NTC, nptech, penguindayDC

Continue Reading

Penguin Day, DC

On 29 Mar, 2007 By mpm

I've been really happy to be involved with Aspiration and PICNet in organizing Penguin Day DC, right after NTC. Please do come - it will be great. There are some amazing breakout sessions planned, and there will be wonderful energy. Here's the official blurb:

Please join us for Penguin Day in DC, right after the Nonprofit Technology Conference (NTC)! We'll explore the potential and the role of Free and Open Source Software (FOSS) in nonprofit organizations, in sessions designed to answer your questions and curiosities!

Recent agenda additions include leaders of the Joomla Team discussing FOSS communities, a passionate Plone practitioner from NetCorps sharing skills, and Beth Kanter, who will lead a discussion on Open Content.

Penguin Day DC is taking place Saturday, April 7th, from 9am to 5pm at:

Josephine Butler Parks Center
2437 Fifteenth Street, NW
Washington, DC 20009
Map Link: http://tinyurl.com/2tj5a3

Register now at http://www.penguinday.org!

The Parks Center is about 1/2 mile from the NTC at the Omni Hotel, and a shuttle will be available from the NTC hotel to the Penguin Day venue.

See the latest Penguin Day DC Agenda at http://penguinday.aspirationtech.org/index.php/Penguin_Day_Agenda. Feel free to add your session ideas!

If you are going to the NTC, come to Penguin Day!

Penguin Day DC is organized by Aspiration, PICnet and NOSI.

What is Penguin Day?

Are you passionate or curious about the reality, the potential and the role of Free and Open Source Software (FOSS) in nonprofit organizations? Do you want to learn about the latest free and open web publishing tools and technologies? Would you like to meet other like-minded and passionate participants, including developers, activists, and nonprofit "techies"?

Penguin Day DC will bring together nonprofit technology staff with free and open source software (FOSS) developers for a day of learning and conversation.

We'll explore and explain open source for nonprofits, frankly address the challenges of developing open source tools for nonprofits, and celebrate strengths and successes of open source in the nonprofit sector. Leading open source innovators in the nonprofit sector will share their stories and knowledge, and focus on answering your questions!

If you are curious about open source software for your nonprofit organization, Penguin Days are for you!

Register at http://www.penguinday.org

Technorati Tags: 07NTC, nptech, opensource, linux

Continue Reading

Goodbye Microsoft...

On 29 Mar, 2007 By mpm With 2 Comments

Just today, I received in the mail some Sony Vaio Picturebook laptops, courtesy of Gavin's regular potlatch program. My goal was to bring them to the Linux geekout at NTC, and have people play with Linux installs on them. But I realized that I had somewhat of a challenge on my hands.

Linux is supposed to be easy to install and use, and in most situations it is. If I had a USB CD-ROM drive, it would be here, too, since it seems from the BIOS of the Vaios that they can boot from external USB devices. But the computers are networked, so, it would seem, there must be some easy way, right?

Debian comes to the rescue. They have a site at the URL http://goodbye-microsoft.com, which links to a Windows .exe file - a simple, sweet Debian installer. It works pretty well (at this moment, I'm downloading and installing the base system.) The one snag I hit (not unusual) is that originally I was using the wireless cards for networking, but the basic Debian system didn't recognize them. So I switched to the very standard ethernet PC cards that Gavin so thoughtfully included, and bingo - everything works.

My next step, after installing the basic Debian system, is to switch the install to Ubuntu. That's easier than it sounds. Changing one file (/etc/apt/sources.list) and running a few commands should do the trick. We'll see...
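For the curious, the "one file and a few commands" would look roughly like the sketch below. This is a hedged sketch, not a tested recipe: the Ubuntu release name (feisty, the current release in spring 2007) and the mirror lines are my assumptions, and a cross-grade like this can need some manual package cleanup afterward.

```shell
# Sketch of switching a Debian base install over to Ubuntu (run as root).
# Assumptions: the "feisty" release name and the standard Ubuntu mirrors --
# substitute whatever release you actually want.

# 1. Replace the Debian package sources with Ubuntu ones:
cat > /etc/apt/sources.list <<'EOF'
deb http://archive.ubuntu.com/ubuntu feisty main restricted universe
deb http://archive.ubuntu.com/ubuntu feisty-updates main restricted universe
deb http://security.ubuntu.com/ubuntu feisty-security main restricted universe
EOF

# 2. Refresh the package lists and upgrade everything to the Ubuntu versions:
apt-get update
apt-get dist-upgrade

# 3. Pull in the desktop environment (the Debian base system has none):
apt-get install ubuntu-desktop
```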

Anyway, if you'll be at NTC, come by and see where I got to. The geekout will be on Thursday at 3:30.

Technorati Tags: 07NTC, nptech, linux

Continue Reading

Disconnected and bored, or is there something else, really to social networking?

On 26 Mar, 2007 By mpm With 1 Comments

I continue to be fascinated with Twitter, and one of the primary drivers of the phenomenon known as Web 2.0: social networking. There have been some rather searing commentaries on Twitter lately from Nicholas Carr, and Kathy Sierra, among others. I'm not going to rehash their interesting and cogent arguments, but I'm going to ask some more fundamental questions about all of this.

In general, I have to admit that I have found very little usefulness in Web 2.0 social networking tools. Yes, I have an account on del.icio.us, digg, ma.gnolia and LinkedIn. I even had an orkut account years ago. There have been some interesting, useful tidbits (I talked with an Apple recruiter, helping her figure out the best way to find people likely to be a good Genius for the new Holyoke Apple store, and I've found a few links now and again), but for the most part, I have gotten back way, way less than I invested in signing up, linking, etc. I'm sure this experience is different for different people, but I wonder whether people really feel like they've gotten useful, concrete benefit from the effort they've put in. I've gotten much more benefit from tools that are heavy on content, and light on networking (like H2Oplaylist, which actually isn't a social networking tool, per se, although it has some interesting functionality in that regard.)

In all of this, I'm reminded of Barbara Ehrenreich's new book, which I'm going to read soon. It's called Dancing in the Streets: A History of Collective Joy. Her premise, as I understand it, is that modern culture has slowly but surely decreased the collective activities that connect us, and allow us to express and share joy. I also can't help but think about that oft-criticized, but interesting book, "Bowling Alone," about the reduction in social capital. It is pretty clear that we as a society have become more and more compartmentalized - each of us in our own little world, with our own little TV and internet connection - and we feel the need to connect with other people.

Back to nonprofit technology - a colleague and I wondered aloud together about the sheer boredom that nonprofit technology can be sometimes - and do new things like Twitter, or Second Life, or what have you, relieve some of that boredom? The boredom of databases, and networks, and accounting and ... But certainly, one could argue that connecting with other people around a particular social issue is useful for nonprofits. Finding ways to tap into, for instance, the vast network that is MySpace could be an avenue to find constituents, donors and volunteers. So I don't want to write off social networking, but it's also true that "old-fashioned" social networking via email lists is still going strong, and there seems to be no substitute for a real, live face-to-face gathering.

But also a push-back to nonprofit technology - if social networking tools like Twitter seem to be band-aids to help heal the wound of a disconnected society - what about the wound itself?

Technorati Tags: socialnetworking, twitter

Continue Reading

My life as an (almost) ex-Technology Consultant

On 19 Mar, 2007 By mpm With 3 Comments

Beth Kanter interviewed me for BlogHer recently, and one of the questions included "... you work as an independent consultant ..." Well, those days are numbered. I decided several weeks ago, for a variety of reasons, to retire my independent consultant hat. I've been doing this work for more than 10 years now, full time for about seven of those years. It's been an important part of my life for all of this time. I had decided to leave it two years ago to go to seminary; then, when I left seminary, I picked it up again briefly. I have now realized I need to set it down for good.

I'm not leaving nonprofit technology, though, just this particular role - I expect to stay involved, keep connected, keep prodding and poking, and keep learning. I expect, in one way or the other, to be putting on a nonprofit hat. For right now, I'm the part-time coordinator of NOSI. Whatever emerges next, you'll hear about it, for sure. (In other words, no, I don't exactly know what I am doing next, yet.)

Being an independent consultant was, for me, a way to feel like I was using my skills for the greater good. I got to be a geek, and feel like I was really making a difference in the workings of organizations, and, hopefully, in people's lives. And, I think I did that. And I also spent lots of time wrestling with the demons of consultancy and for-profit-hood (or "for-little-profit-hood," as one consultant once put it). If I had it to do over again, I would have started a nonprofit technology organization 10 years ago. Although it certainly could be argued that would have just involved different demons. Perhaps I'm now more ready to wrestle those.

In any event, I have a lot of other skills and knowledge besides databases and coding: skills and knowledge in teaching, in writing, in working with organizations, in facilitation, in religion and spiritual practices, and in working with people, that I want to use now. I want to work more directly with people and organizations, primarily focused around faith and spirituality. I want to see people's real faces, and hear their real voices. I want to smell the sweat of working for change in our society, from the inside out.

Continue Reading

Open Source vs.(?) open data

On 19 Mar, 2007 By mpm With 2 Comments

I know most of you aren't surprised, but I'm not the Richard Stallman of the nonprofit technology community. And it's not just because I'm female. I've never been dogmatic. I've always known that when it comes to implementing free and open source software in the sector, pragmatics are important. (And, no, don't even think about comparing me to Eric Raymond!)

Lately, I've been thinking a lot about Web 2.0 - and how that changes the equation for nonprofits. There are now three choices for many applications: proprietary, open source, and web-hosted. The web-hosted applications aren't software; they are a service. One could argue that whether or not they are open source is about as relevant as whether or not Google is open source.

Of course, there are all sorts of other reasons for people to choose open source software over hosted software. Data privacy and security are really important ones. (Some organizations with some kinds of sensitive data, like reproductive rights organizations, should always host their own data.)

But is open data a good substitute for open source? If a proprietary web-hosted service (most are) has lots of open APIs, providing free and easy access to data for an organization, is that OK? Is that enough? I'm tempted to say "absolutely." Of course, the best web-hosted alternative is one that is both open source and open data - these exist, but are few and far between.

This is, of course, from one single organization's point of view. From the sector's point of view, open source is better. One single organization certainly isn't going to be in a position to do anything if, for instance, Salesforce.com open-sourced their code. But a group of nonprofits who do particular kinds of work could potentially modify a codebase like that to create something that worked really well for them.

So, I'm pragmatic. I want the best quality, most open (data and source), and least expensive solutions for nonprofits. I've come to realize that can come in different kinds of packages.

Technorati Tags: nptech, openAPI, opensource, web2.0

Continue Reading

The scarcity mentality

On 19 Mar, 2007 By mpm With 1 Comments

Kudos to Michele Martin, who brings up a sticky issue: the scarcity mentality. Her perspective is that the scarcity mentality of nonprofits (the idea that there is only one pie, and we only get our small slice) helps keep nonprofits from taking full advantage of social media (i.e. Web 2.0). I'd argue that it also keeps nonprofits from collaborating to produce amazingly good open source software projects (or even closed-source ones, for that matter) that would help solve their issues and keep them from being captive to predatory vendors, or to vendors whose products promote data lock-in, whether because of bad design or not enough resources.

If ten similar nonprofit organizations came together to build a system that would work for them, they each would get 10 times the software that they could afford individually. But they are so busy living in that mentality of scarcity and competition that they can't do that kind of collaboration. So it doesn't happen. Web 2.0, collaboratively developed software, and, really, collaborations of all sorts are limited by this mentality.

This reminds me of a true story. A long time ago (in web years) I was working with a certain CEO of a certain chapter of a certain very-big-nonprofit (whose role in life is to fund other nonprofits - this kinda gives it away, but it's necessary for the story). We were talking about whether or not this certain nonprofit, which had mondo resources, should help facilitate web development for their client organizations. They had realized that if they did that, the client organizations could begin to raise money themselves, instead of depending so heavily on this certain nonprofit. So, guess what? No web development help. I was, of course, surprised (that's mild - I was frankly horrified. Wasn't it the mission of this certain nonprofit to help the client nonprofits raise money? Wouldn't helping them raise money themselves fulfill their mission?) But that's scarcity thinking for you. Even though this very-big-nonprofit was rolling in money, they thought the pie was finite, and that if the money didn't go through them, they'd get less. So the scarcity mentality isn't just for small, struggling nonprofits. It's very widespread.

Technorati Tags: nptech, opensource, web2.0

Continue Reading

Satellite Internet: Week 2

On 13 Mar, 2007 By mpm With 9 Comments

I promised updates on satellite internet, and here are my impressions so far. In general, it works well. We've been getting download speeds from 1.5 Mbps (the advertised speed) down to 300-400 Kbps during peak moments. The latency isn't too much of an issue for email or the web. It makes shell sessions basically impossible for all but the simplest stuff. FTP seems to work fine, as do streaming video and audio. I haven't bought anything from the iTunes store yet, or tried Skype for a voice call yet either.

The one caveat to all of this is what is called "FAP," or "Fair Access Policy." In this regard, satellite broadband is fundamentally broken for any of the data-heavy applications that many people want out of broadband. Basically, FAP is a threshold, and once you reach the threshold, your bandwidth is throttled down to what they say is dial-up speeds, but in fact is much worse. If you recall my last post on this - what I had experienced was FAP. On my plan (the highest plan), if I try to download more than 400 MB of anything "at one stretch" (this is the term I was given by a tech support person) I'll get throttled. On the "home" plans, the threshold is a measly 175 MB.

Here's my (minorly edited) transcript of my chat with tech support:

Taylor (Mar 6 2007 4:13:34 PM): Michelle, I have been through your usage data.
Taylor (Mar 6 2007 4:14:38 PM): I have learnt that, you are subject to FAP, because you have downloaded 71MB, 122MB and 211MB of data at a stretch.
Taylor (Mar 6 2007 4:15:30 PM): The sum results to 404MB, which is greater that 400MB.
Michelle (Mar 6 2007 4:15:56 PM): that's over 3 hours ...
Michelle (Mar 6 2007 4:16:35 PM): is there a way that you can exclude necessary software updates?
Michelle (Mar 6 2007 4:17:02 PM): I thought it was 400 over 1-2 hours
Taylor (Mar 6 2007 4:17:14 PM): I am sorry. There is no way that we have that option for excluding the software updates.
Taylor (Mar 6 2007 4:17:39 PM): I am sorry. You should be able to browse after 8-12 hours.
Michelle (Mar 6 2007 4:17:44 PM): So over how much time do I have to space the dowlnloads then?
Michelle (Mar 6 2007 4:18:07 PM): 404 over 3 hours is too much. What about 404 over 4 or 5 hours
Taylor (Mar 6 2007 4:18:20 PM): Your download should not exceed above 400MB at a stretch.
Michelle (Mar 6 2007 4:18:42 PM): at any one stretch? How long is a stretch?
Taylor (Mar 6 2007 4:19:22 PM): If you try to download any data which is above 400MB at one go, you will be subject to FAP.
Michelle (Mar 6 2007 4:19:55 PM): so if I wait, say, 20 minutes between downloads I should be fine?
Michelle (Mar 6 2007 4:20:07 PM): but then doesn't regular web browsing add into that?
Michelle (Mar 6 2007 4:20:50 PM): like if I download a 50 M file, then browse, then another 50 M file, an hour later, I might still be in trouble?
Taylor (Mar 6 2007 4:20:49 PM): I am sorry. If you try to download any data which is above 400MB at one go, you will be subject to FAP.
Taylor (Mar 6 2007 4:21:35 PM): If you stop downloading data before it hits the Download Threshold, you will not be subject to FAP, irrespective of the time taken to download.
Michelle (Mar 6 2007 4:21:46 PM): but if I am using the net constantly, that's one go, isn't it?
Michelle (Mar 6 2007 4:22:22 PM): no matter whether I'm downloading files or doing email or browsing the web?
Taylor (Mar 6 2007 4:23:28 PM): After your account has been restricted by FAP, you need to wait 8-12 hours for the FAP to be lifted.
Taylor (Mar 6 2007 4:23:43 PM): Logging off of the HughesNet satellite network does not remove the FAP from your account, it should cause it to be lifted sooner.
Michelle (Mar 6 2007 4:23:49 PM): I do totally understand why this policy exists, but the truth is, there has to be some way to distinguish between people who are downloading music and games and such, and people who are downloading necessary software updatees, which, unfortunatley, get bgger and bigger every year.
Taylor (Mar 6 2007 4:24:57 PM): I understand your concern over this issue. I will try my best to forward this concern to the concerned.
Taylor (Mar 6 2007 4:25:30 PM): You should be able to download the 211MB update once you have been uplifted from FAP.
Michelle (Mar 6 2007 4:26:22 PM): yeah, but then once I download the update, I have to pretty much stop everything for a while. Sigh. OK, thank you very much for your time.

Notice, 400 MB is smaller than a Linux ISO. It's smaller than any movie, and is about as big as one TV show video at decent resolution. It's smaller than the sum of the Apple software updates I had to do. I have to plan my downloads carefully, and downloading an ISO requires a download manager I can pause and resume.
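The arithmetic is simple enough to sketch. This toy script just adds up a batch of planned downloads against the 400 MB threshold; the three sizes are the ones tech support quoted back to me above.

```shell
# Check whether a batch of downloads (sizes in MB) would trip the 400 MB FAP threshold.
THRESHOLD=400
total=0
for size in 71 122 211; do
  total=$((total + size))
done

if [ "$total" -ge "$THRESHOLD" ]; then
  echo "WARNING: ${total}MB at a stretch would trigger FAP"
else
  echo "${total}MB is under the ${THRESHOLD}MB threshold"
fi
```

In practice this is why the pause-and-resume download manager matters: something like `wget -c URL` can pick up an interrupted download where it left off, so a big ISO can be split across several "stretches" instead of arriving in one FAP-triggering go.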

Am I happier with satellite? Sure. Because nobody these days designs websites for dial-up. On dial-up, 20% of websites didn't load at all. Another 40% were so slow I could go make tea and come back. It just wasn't going to be viable in the long term. Someone who also lives out here said that with dial-up, the internet feels broken. That's certainly true. But satellite isn't really broadband. I hear it's improving, but it will never really be the broadband everyone else has.

Technorati Tags: broadband, hughesnet, satellite

Continue Reading

Back Channels, Workflow, Data, Twitter, and me

On 13 Mar, 2007 By mpm

I read Beth's recent blog entry on Twitter, and of course, the neo-luddite in me said "waste of time!" But then I had to think about it. What is it about Twitter that seems so, well, beside the point? Why aren't I Twittering away, like so many of my nptech colleagues?

One of the potential uses of Twitter is as a "back channel" for events and the like. So a conversation is happening, in a session or in a plenary, and people are talking about it on another channel. IRC has always seemed to me to be one of the best ways to do that, but IRC is, well, passé these days, it seems. But why not add another channel to the back channel happenings, for instance, at NTC? Sounds like a reasonable idea.

But why don't I sign up for a Twitter account, and, well, Twitter? First off, it doesn't fit into my workflow. My workflow does involve the web, for sure, but I can only keep up with so many social networking sites of varied uses (social bookmarking, social networking like LinkedIn, community blogs, etc.) before I'm spread far too thin, and Firefox starts piling the tabs into a menu. Then there is the overhead of signing up for an account, and then inviting people, and linking, and blah blah, just so I can tell people what I am doing at the moment. Huh?

And then there is data. We all talk about information overload. Give me tools, like Yahoo Pipes, or other kinds of things, that help me to whittle down the information I'm taking in - make it more useful, help me find things faster and easier. I love my friends and colleagues. Really, I do. But, for the most part, I really don't want to know what they are doing right now. I need less, and better, data - not more.

And there is more: it might keep me at my computer longer. It will use up server resources at some server farm that uses electricity (which probably means burning dirty coal or natural gas), on servers that took a huge amount of energy and water to manufacture. All this so a bunch of people can tell each other what they are doing. And, I'm sorry, but "Insert_Web2.0_new_thing to change the world!" just doesn't cut it for me.

Wanna know what I'm doing? Check my Skype status. Or IM me. Those windows are already open, and I use them every day. But sorry, I'm not Twittering.

Technorati Tags: nptech, ntc2007, twitter

Continue Reading

Carnival of Nonprofit Consultants: Nonprofit Data Management

On 12 Mar, 2007 By mpm With 2 Comments

As you know, nonprofit data management has been a really important issue for me for a long time. So I thought it would be a great subject for the Carnival of Nonprofit Consultants hosted here today.

There are some great posts for today:

  • First, Katya's Nonprofit Marketing Blog has a great post with an article from Cheryl Gibson about implementing a CRM system in a nonprofit. It's chock-full of information about what you need to start, different strategies, and potential pitfalls. My favorite quote: "A mutual understanding between the nonprofit organization and the database implementer that converting a database involves organizational change, and this can be stressful and threatening for employees. Both the database implementer and the nonprofit organization will need to establish in the project plan the metrics and deliverables that comprise success." I wish all nonprofits understood this!
  • IDI's Blogger Relations has some good ideas and resources on data management - they suggest, and I agree, that it is critical to managing fundraising strategy.
  • Michelle Martin, over at The Bamboo Project Blog, talks about two cool Web 2.0 tools, i-Lighter and Google Notebook, for managing online notes and such. I'm a fan of Google Notebook myself, but I've never heard of i-Lighter - I'll have to check it out.
  • Kivi at Nonprofit Communications talks about how to keep track of the kind of data that writers need to keep track of - editorial calendars. I want to do more and more writing, and doing this sort of thing might be pretty helpful for me.
  • Beth has some advice for dealing with too much data - write it down (er, on a wiki that is.)
  • Finally, if data management stresses you out, here are some tips!

Keep track of the Carnival of Nonprofit Consultants, no matter which blog is hosting, by subscribing to the Carnival feed.

Technorati Tags: carnival, data management, nptech

Continue Reading

Carnival Hosting Again!

On 07 Mar, 2007 By mpm

Next Monday, I'm hosting the Carnival of Nonprofit Consultants. The topic is "Nonprofit Data Management: from slips of paper to CRM." As you all know, I've been thinking about data management issues for years, and it would be great for people to share their ideas and knowledge.

So, submit those posts by Sunday evening!

Technorati Tags: data management, npcarnival, carnival

Continue Reading

Satellite Broadband, Day 1

On 05 Mar, 2007 By mpm With 2 Comments

Dial-up just wasn't going to cut it. As someone I talked with today said, "with dial-up, the internet feels broken." Someone else said that it would be good for designers to be forced to live with dial-up for a while.

Anyway, it just got to be too much. I spent lots of time reading the reviews (many of which were BAD), but we decided to plunk down the dough (lots of it) and go for it. I figured I'd start a chronicle of it. We got HughesNet. They did have some really bad reviews on dslreports.com, but the other folks, WildBlue, weren't installing anything in our area.

Day 1 started out well. The install was very smooth, except the guy had to chip a bunch of ice off the roof. At first, he thought he'd have to come back in spring, when the ground had thawed, to put in a pole. But the roof works fine. He connected the modem, which I happily connected to my Airport Extreme - and we're off and running.

[speed test screenshot]

It didn't start out bad. The advertised speeds are 1.5 Mbps down and 256 Kbps up. As you can see, the download speed is fine, and the upload speed is positively zippy compared to what they say it's supposed to be. Pandora works dandy. YouTube ain't bad. It even snowed a bit - and everything seems fine. Shell sessions are basically not doable unless I am amazingly patient - the 4-second latency is definitely a problem in that case.

But it didn't stay good. About 5:00, the bandwidth hit the floor. It bounced back up a couple of times. Now, it's at:

[speed test screenshot showing much lower speeds]

That sucks. At least I'll be able to upload this blog entry.

More on this saga soon.

Technorati Tags: broadband, hughesnet, satellite

Continue Reading

March Blogtipping

On 02 Mar, 2007 By mpm With 3 Comments

I'm following Kivi on the NPBlogtipping bandwagon. I think it's a great idea, although I'm one day late. I promise in April I'll be right on time (oooohh, that could be fun, April Blogtipping on April Fools' Day ...). Blogtipping is great for our sector, and it's a great way for me to think more about all of the blogs that I read.

I have one note: I feel a little strange giving tips - a lot of the blogs I really like are way better than mine. But anyway ...

1) Ed Batista's blog.

What I like: I like reading Ed's perspectives on organizations, how they work, and the kinds of strategies and approaches that are around to work within organizations. I like the wide variety of resources he brings to bear - he's introduced me to a lot of great writers and thinkers in varied areas. I like the design of Ed's blog as well.

My tip (more of a request): I'd love to hear more about your perspectives on how technology can help (or hinder) organizational management, dynamics, and change.

2) Jonathan Peizer's Philantherapy blog.

What I like: As a longtime fan of JP's, it's great to hear his perspective on the nonprofit sector. He's down to earth, ruthlessly honest, and gets right down to the heart of things. I like JP's take on the development side of nonprofit issues.

My tip: JP, write more! And tag your posts so more people can find them - they are gold.

3) Deborah Finn's Technology for the Nonprofit and Philanthropic Sector.

A note: Deborah is an old college friend of mine.

What I like: I hear about a lot of new things from Deborah, and she explores a lot of new tools, and asks good questions. She uses her experiences to talk about technology issues. Her posts are thoughtful and accessible for non-techies. I like that she includes graphics in most of her posts. She has a fabulous, up-to-date blogroll/link list.

My tip: Allow non-registered users to comment - I imagine you'd get more comments that way. Also, a minor technological thing - the RSS feed is a bit wonky sometimes (strange formatting, and it's not always clear when there are new items).

Technorati Tags: npblogtipping, nptech, blogtipping

Continue Reading

Computerless

On 20 Feb, 2007 By mpm With 2 Comments

My laptop is in the shop. I'll spare you the details. It's truly a practice in patience to live without a computer. Work doesn't get done. Blog entries don't get written. Emails don't get returned.

It's a lucky thing my partner has a laptop I can beg and borrow (stealing might cause issues).

My practice in patience only goes so far. The Mac Mini I'd been planning to get for a while just got ordered, next-day shipping. It might even arrive before the laptop gets out of the shop. But even if it doesn't, at least the next time a computer dies, I'll have a backup.

Continue Reading

NTC on Passover and Good Friday

On 06 Feb, 2007 By mpm With 6 Comments

Relatively close on the heels of my post on Spirituality, I read a post on a blog I've never read before, A View From Home. She is surprised that NTC is happening over Passover and Good Friday (April 4-6), and is having to make a tough choice and not attend NTC this year. She says:

What's done is done. Like I said, I love NTEN and I know that if they could turn back time and make a different decision they probably would. I'll have to catch the next east coast conference and hope that it's at a better time. But I can't help but wonder how the faith-based organizations that are NTEN members feel about this? Are all the vendors who come from the west coast who happen to be Jewish skipping their seders to travel?

She is surprised that no one else has talked about this time conflict before. Well, I was going to, but she beat me to it.

I'm going to NTC this year. I'm not celebrating Good Friday, etc., in any real observable way. It is too bad that I'll likely miss my chance to go to a seder, which I would have liked to do. But I'm not really blogging about this for personal reasons. What I find most interesting is that when a survey was done of people who would go to NTC, many more people wanted cheaper hotel rates than wanted to avoid conflicts with holidays. I do know that, in general, faith-based organizations are not well represented in NTEN - which makes sense - most faith-based organizations aren't large enough to pay tech staff, and don't have enough infrastructure to benefit from an organization like NTEN. The truth is, the nonprofit technology field is overwhelmingly secular. I don't think this is a problem - it's just reality, an interesting reality.

Technorati Tags: 07NTC, nptech, nten, religion

Continue Reading

The Convio and Get Active Merger: Lessons for Open Source and Openness

On 06 Feb, 2007 By mpm

I listened in on the conference call about the merger of Get Active and Convio, because I was curious, and I wanted to find out what the lessons are in terms of both open source options and openness of data. I was pleasantly surprised about how much was talked about in both of these realms. If this had happened a couple of years ago, I doubt much would have been said.

On the call: Gene Austin (Convio), Sheeraz Haji (Get Active), Tom Krackeler (Get Active), and Dave Crooke (Convio).

They talked about being excited by the openness of the Get Active architecture with Get Active Extensions, and they expect to accelerate the openness of the Convio architecture. Sheeraz talked about having both development teams working on opening up the Convio and Get Active systems and APIs.

They seem quite committed to providing openings and hooks into their applications that allow clients to get at their data. There was quite a lot of talk about APIs, and about integrating the applications with other applications, including Google. They will use the need to move data from Get Active to Convio as an opportunity to build external transaction mechanisms and the like that will be opened up completely. Convio uses Salesforce for their own customer relations management - they are a big Salesforce user - but they haven't had many requests for integration with Salesforce.

A question was asked about open source - whether they were moving in that direction. Dave Crooke talked about how they think that open source is a great model for developing software. Both companies use a lot of open source components in their development. They think open source has a lot to offer to the nonprofit sector. But they don't envision open-sourcing their codebase: the value isn't the software, it's the service.

And in terms of integration with open source CMS systems such as Plone or Drupal: as they develop integration between CRM and CMS, they'll open that up as well. They talked about Get Active's hooks with Plone, and they envision doing more like that. It will never be as tightly integrated as the Get Active CMS, but they want to make it possible for their customers to work with whatever CMS they want.

All in all, it was an interesting call. I'm glad I listened in. It provokes the thought of a post on "openness vs. open source" that I'm marinating in my head.

Technorati Tags: crm, fundraising, nptech, software

Continue Reading

Day of (Micro) Service

On 01 Feb, 2007 By mpm With 3 Comments

One of the things I've always enjoyed about NTC (the Circuit Rider Roundup, as it was called before that) was the Day of Service. It was a great opportunity to work with organizations I wouldn't get to work with normally, and, sometimes, to stretch myself a bit. It's also a chance for a local group of organizations to benefit from the influx of nonprofit technology types coming into their locale.

So, once I had decided to go to NTC, I automatically signed up for the Day of Service. I was looking forward to it.

Then, the day came when the list of projects that one could get involved with came out - and each and every one of them involved a Microsoft product - whether it was a Windows network, training in Excel or the like. The one non-Windows project was MS Office training for the Mac!

No generic technology planning, no database planning, no open source, no internet or web anything (1.0 or 2.0). And, since I don't do Windows, and haven't used MS Office for the Mac in a while (and don't have it installed on my machine, so I couldn't even do a brush-up), I don't get to be involved in Day of Service. I know this isn't even representative of what nonprofits are dealing with right now. Sure, most of them depend on Windows and MS Office, but they have other wide-ranging needs, like database planning, web sites, etc.

This post perhaps fits in the "gripe" category, although I hope that no one will take it personally. It is especially not meant as a jab at Beth Kanter, the wonderful nptech blogger and all-around guru, who's been running Day of Service forever, and does a great job with managing it and herding the cats known as nptech folk. I know that she and a lot of people would want a broader set of projects available. Perhaps next year?

Technorati Tags: 07NTC, nptech, nten

Continue Reading

Flung back 10 years and hurting

On 30 Jan, 2007 By mpm With 4 Comments

I'm facing a reality that many people live with every day (like my parents). And I thought I could live with it. I thought it would be fine. I thought ...

What is it? No broadband.

Where I'll be living quite soon is in, as some have called it, "the land time forgot" - Shutesbury, Massachusetts. It's a great rural town, with not a lot of people (population 1,900). But the people are spread out far enough that neither the cable company nor the phone company finds it worthwhile to install the infrastructure for broadband. And cell phones don't work there either, so any cell-based broadband is out, too.

My options seem to be:

  • Live with dial-up and wait for the powers that be (Verizon, Comcast, someone else) to finally offer broadband
  • get really sucky satellite internet at astronomical prices with long contracts, and very extreme download limits (possibly too low to even bother with)
  • become my own ISP by getting a T1 and sharing it by WiFi or some other method (if that will even work, given how far our neighbors are from us.)

So, all I can say is that this seems to be a great opportunity for thinking deeply about what's important to me. There are things I take so completely for granted, like Skype, downloading big Linux ISOs, bittorrenting video files, etc., that I won't be able to do anymore, unless I pretty much go with option 3. Options 1 and 2 will limit what I can do fairly dramatically. Is all of that worth it? I can pretty much do any work I need to do with dial-up (in fact, satellite would make things like SSH sessions impossible - so that's another mark against it). I could rent an office in town. I can go to Rao's, or the Book Mill, a few times a week. I could be patient - waiting for technology to catch up.

As a Buddhist teacher might say: it's all fodder for practice. In this case, practicing patience, and getting used to going to get tea while websites load.

Technorati Tags: puppy

Continue Reading

The Fundit

On 19 Jan, 2007 By mpm

As part of the Nonprofit Blog Exchange event #5, I'm blogging about the blog called "the fundit" (I love that name), which is a fundraising blog for Canadian nonprofits. As a real fan of Canada, I see this as a great opportunity to learn more about how that all works up north.

Her blog is full of concrete tips and resources for people who do fundraising in Canada. She also has some great links to more broad topics, which I was glad to read about:

Technorati Tags: fundraising

Continue Reading

Integration Proclamation

On 19 Jan, 2007 By mpm

I've been meaning to blog about this for a while, but have gotten sidetracked. A while ago, a group of folks got together to create the "Integration Proclamation." They say:

Technology integration, also called "interoperability," means getting one program to seamlessly share data with another program -- i.e., getting programs to "talk" to each other. If you're a progressive, you should care, because "dis-integration" is killing us.

There are a lot of great tools out there for progressives -- email systems, volunteer databases, donation engines, social networking tools, the list goes on and on. But because these tools can't talk to each other, we can't use them effectively. Ask organizers about their tech tools, and you'll hear the same story over and over: too many overlapping databases, systems that don't work together, hours wasted importing and exporting and de-duplicating lists. In a recent study about progressive technology, lack of data integration was cited as the #1 universal complaint.

I'm encouraging everyone to sign the proclamation and, if you are a vendor or consultant, tell your clients you've signed it and are working to make integration between applications a reality.

Technorati Tags: dataintegration, nptech, interoperability

Continue Reading

Spirituality and Technology

On 19 Jan, 2007 By mpm With 4 Comments

A number of people have written me and said that they appreciate that there is a blog with a spiritual take on technology. I initially intended to do a lot more of that, but got kind of caught up in the geeky stuff. (I can't help it.) But I do want to spend more time thinking about this issue.

One of the things that I have tried to do with this blog, and will continue to do, is to get underneath the surface issues - like the surface issues of the recent CRM vendor mergers, or the issues relating to open source software. And, like the tradition that the name of this blog comes from, I want to look at technology without attachment or aversion - with an openness to different ways of thinking about, or doing, technology in the nonprofit sector. I don't think I live up to that quite as well as I'd like, given my preference for open source solutions. (Which reminds me of what was said by the Third Zen Patriarch: "The Great Way is not difficult for those with no preferences.")

But it is all pretty unformed - how do I bring my deep commitment to spirituality (a commitment that is, in fact, at the core of my life) to this work? How do I talk about these issues in a way that people from all perspectives and traditions, from completely atheistic to deeply religious, can appreciate? How do I help people dig deeper into the core of issues when we usually spend a lot of time on the surface? These are the questions on my mind, and as I think more, and learn more, I'll write more here. Feel free to comment on things you'd like to see me explore, or the kinds of things you've explored yourself.

Technorati Tags: spirituality, technology

Continue Reading

The Zen of Nonprofit CRM

On 19 Jan, 2007 By mpm With 1 Comments

I was reading about the GetActive/Convio merger, and I have some thoughts about it...

It is clear that the CRM/Fundraising space is getting interesting - first with the entry of Salesforce, and now with the mergers of GetActive and Convio, and Blackbaud and Target. Consolidation among vendors means that some customers will be dealing with different (and larger, potentially less friendly) entities. It also means fewer options. On the other hand, perhaps it means that these new, larger entities can provide services and resources that the smaller ones could not.

It doesn't really change anything, though. Nonprofits still have decisions to make about what software to use. And it's still clear that CRM/Fundraising software is where the money and resources are going in nonprofit software development. Nor does it change any equations about whether or not to choose open source solutions: they are still open, free, and useful, but they can't yet compete with many of the commercial solutions in terms of usability and functionality - and that will remain true for as long as nonprofits choose to spend money on commercial solutions instead of pooling resources to collectively create, sustain, and improve open source options.

In the final analysis, in the days, weeks, months and years following these, and other mergers, no fewer people will be homeless, no fewer women will be battered, no fewer children will be hungry, no less environmental damage will be done, no more people who need it will get mental health services. But a few more people will have a lot more money in their bank accounts. And this, I think, is one really important thing to think hard about. Are the means that progressive organizations use to reach their ends truly in line with their mission?

Technorati Tags: crm, fundraising

Continue Reading

The Wealth of Networks Chapter 4

On 12 Jan, 2007 By mpm

I know you've been waiting for this. Here is, finally, chapter 4. This book is really good, but it's also very slow going. It will take me a while to finish it, I think. I'm hoping to read a lot of it in the next couple of weeks.

A note, for those of you who don't read my personal blog: I'm moving on Tuesday, from California back to Massachusetts - a very long, meandering trip that will take about a month (it's a long story - read the blog). So I'll probably be doing more blogging on my personal blog than on this one, just because I won't have lots of online time, and I'll be more in a travel mode than a thinking-about-technology mode. But I do have a bunch of things on tap, like continuing with Benkler, finishing my Open Standards series, and continuing the open source databases series. I have also been doing a bit more thinking about what is, in some ways, the undercurrent of this blog: spirituality and technology. There have been some interesting ideas marinating that I'll share soon. OK, on to Benkler...

Chapter 4 is called "The Economics of Social Production." In this chapter, Benkler lays out an important argument: people engage in social production from a variety of motivations, and it is possible to generate economically significant amounts of effort from motivations that are not economic. In addition, the increasing involvement of social production in market-based business will change the way that business is organized. His basic argument is summarized as:

"It is the feasibility of producing information, knowledge, and culture through social, rather than market or proprietary relations - through cooperative peer production and coordinate individual action - that creates the opportunities for greater autonomous action, a more critical culture, a more discursively engaged and better informed republic, and perhaps, a more equitable global community."

I think that's something we can likely all agree is a good thing.

First, he asks "why do people participate?" He talks about the simple economic models of human motivation, which assume that there are "things people want, and things they want to avoid," and that those can be translated into money - a universal medium of exchange. He explains, with some great examples, why these models are wrong: "If you leave a fifty-dollar check on the table at the end of a dinner party at a friend's house, you do not increase the probability that you will be invited again." He then talks about the importance of social capital over money: "If you want to get your nephew a job at a law firm in the United States today, a friendly relationship with the firm's hiring partner is more likely to help than passing on an envelope full of cash." People would rather participate in some things for social standing and recognition than for money.

He then talks about feasibility and efficiency of peer-based production vs. market-based production, and comes up with this stunning statement:

"A society whose institutional ecology permitted social production to thrive would be more productive under these conditions than a society that optimized its institutional environment solely for market- and firm-based production, ignoring its detrimental effects to social production."

His arguments are compelling and interesting. He then talks about how social production has emerged in the digitally networked environment, and the ways in which it has interfaced with market-based production, using examples such as Red Hat and IBM. And he talks about how the relationship between users and businesses changes:

"Active users require and value new and different things than passive consumers did. The industrial information economy specialized in producing finished goods, like movies or music, to be consumed passively, and well behaved appliances, like televisions, whose use was fully specified at the factory door. ... Personal computers, camera phones, audio and video editing software and similar utilities are examples of tools whose value increases for users as they are enabled to explore new ways to be creative and productively engaged with others."

The nonprofit take-away that came to mind for me: think about the model of nonprofits as passive consumers of software versus nonprofits actively engaged in collaboration in a peer-production environment, where they are much better able to define what that software looks like and how it works for them.

On to chapter 5...

Technorati Tags: books, intellectualproperty, nptech, opensource, peer2peer

Continue Reading

Tagging Discussion

On 06 Jan, 2007 By mpm With 1 Comments

Beth started a cross-blog discussion about tagging and folksonomies, and I thought I'd weigh in. Gavin started this all off by posting a good and interesting set of questions about the efficiency of folksonomies.

I'll agree with Gavin that folksonomies sure are less efficient, and a lot messier, than taxonomies. But is efficiency the most important thing? And there is one really big thing that taxonomies miss and folksonomies get: who is doing the categorizing? Taxonomies are developed by specific people for specific purposes, and as such are limited by worldview and perspective. Gavin says: "I'd recommend the wisdom of a few experts within that crowd." Good point, except: who are those experts? What is their worldview, and how does that affect the taxonomy they come up with - and what effect does that taxonomy have on people who are not the experts?

I think it is certainly possible to disseminate some guidelines (that some people will pay attention to) for the use of the nptech tag that could increase the signal/noise ratio. But I think the larger question about folksonomies is important: is efficiency all there is, and in what ways are folksonomies a way for the "folks" (rather than the "experts") to have access to the process of categorizing their own content, and content they care about?

Technorati Tags: nptech, tagging, folksonomies

Continue Reading

Open Source Database solutions part I

On 01 Jan, 2007 By mpm With 3 Comments

I'm throwing up my hands. Y'all will just have to live with overlapping series. I have too many ideas to be sequential. I promise (!) more on Open Standards and Benkler (actually, Benkler is up next - I've got two chapters to review).

I've been using databases since I was a grad student in the 80s, and I've been designing and developing database-driven applications for the web since 1995. I've been using varied Unix-based databases since then (as well as others, including Access and FileMaker Pro), and most have been open source.

Although I've been using databases for a while, I've decided to focus specifically on open source databases for the next while - in particular, the different kinds of open source solutions that are possible for desktop database systems, or systems that might be server-based but need a desktop front end. I'm particularly interested in the open source technologies coming down the pike that might bump Access from its perch as the general-purpose nonprofit desktop database king, and that can provide nonprofits with flexible, robust data management solutions.

So here is my current survey of the landscape. I'll be working a lot with Open Office, and hope to design some screencasts using Open Office Base sometime in the next few months. I'm starting this series off with just a list of the server-based DBMSs. I'll talk next about desktop DB options (which mostly use these as backends), and last about ways to put this all together in an all-open-source landscape.

Server-based DBMS (DataBase Management Systems)

  • MySQL - MySQL is, I think, the most popular and best-known open source DBMS. It is cross-platform. It is the most popular because, historically, it has been the fastest of the open source DBMSs, but it has always lagged behind in terms of ACID compliance and other features. You can access a MySQL database via the many different drivers that people have written for just about any programming language. It is also possible to access MySQL databases via ODBC (Open DataBase Connectivity) or JDBC (Java DataBase Connectivity).
  • PostgreSQL - PostgreSQL has always been my favorite. I've been using it since it was called Postgres95 - before version 6. (Wikipedia has a great entry on PostgreSQL, including some history.) PostgreSQL has always been ahead of MySQL in terms of ACID compliance and robustness, and still is. It lagged behind MySQL for years because of speed issues (it was much slower), but that has changed with the newest versions, to the point that PostgreSQL is now faster and more scalable than MySQL. PostgreSQL is also cross-platform, with binaries available for Linux and Win32 from Postgresql.org, and for Mac OS via Darwin Ports. A PostgreSQL database can, like MySQL, be accessed via APIs written for just about all programming languages, as well as JDBC and ODBC (which I have quite a bit of experience with).
  • Firebird - this is a newer kid on the block, sort of. It has a very long history, though, since it is based on Borland's InterBase codebase. It doesn't have nearly the user base, or the number of available tools, of the others, but InterBase is a pretty interesting product with some good features (like a small footprint, server performance tuning, and a great rollback and recovery system). It is also cross-platform.
  • Apache Derby - a DBMS written entirely in Java. This project has a small footprint, and is designed to be easily embedded in other Java projects. It comes with a scripting language and interpreter, called 'ij', which is how you interact with Derby on the command line. You can also, of course, use JDBC to access Derby. I'll be doing a fair bit of experimentation with Derby ('cause I'm curious).
  • SQLite - a small-footprint C library that implements an ACID-compliant DB engine. It has a command-line tool, and it is possible to use C/C++ and Tcl for database access. Unlike the others, which are released under varied open source licenses, the code for SQLite is in the public domain.
  • There are a few others (see list here), but they are either research-focused (like Ingres), barely developed, or have small user bases, and seem not relevant to nonprofit technology.

Nonprofit technology take-home lesson: MySQL is certainly the leader - it's most commonly thought of as the "M" in LAMP (Linux, Apache, MySQL, PHP/Perl/Python), which is an nptech web mainstay. I'd argue that PostgreSQL is a better choice, but for most nptech applications it doesn't matter - what matters is what your tech/consultant knows, and that's much more likely to be MySQL. The others are mostly of interest to pretty small niche groups, for specific kinds of projects.
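To make the contrast between a server-based DBMS and an embedded engine like SQLite concrete, here is a minimal sketch using SQLite from Python (the `sqlite3` module in Python's standard library wraps the same C library described above; the table and figures are invented for the example):

```python
import sqlite3

# SQLite needs no server process: the whole database lives in a single
# file (or, as here, in memory), which is why it embeds so easily.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE donors (name TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO donors VALUES (?, ?)",
    [("Ada", 50.0), ("Grace", 125.0)],
)
total = conn.execute("SELECT SUM(amount) FROM donors").fetchone()[0]
print(total)  # → 175.0
conn.close()
```

The same SQL would run largely unchanged against MySQL or PostgreSQL through their own drivers; the difference is that there would be a server to connect to.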

Technorati Tags: databases, opensource

Continue Reading

I've been tagged

On 23 Dec, 2006 By mpm

This will be my last post of the year - I'm off to do some writing in a totally different genre. I wish everyone happy holidays, and a happy new year. When I get back, I'll complete my series on The Wealth of Networks, continue the series on open standards, and probably start a series on specific open source tools that I use on a regular basis. (I promise I'll try to complete the first two before I start on the next one!) There will also, of course, be more neo-luddite, curmudgeonly posts on Web 2.0, software development in the sector, intellectual property, and other thorns in my side.

Angela of Grassroots.org tagged me, so it's my turn to tell 5 things most people don't know about me:

  • I'm learning to play the bass guitar
  • I'm learning Spanish, because my partner is fluent
  • My upcoming goal is to bake all of our bread (I just this afternoon finished a wonderful rosemary focaccia; next on my list is challah for next Friday)
  • I am a fan of Pandora
  • One of my favorite writers is Sheri Tepper

So, I\'m tagging Allen Benamer, Michael Stein (East Coast), Marnie Webb, Michelle Martin, and Jen-Mei Wu. Have fun!

Continue Reading

On 22 Dec, 2006 By mpm

Here are a few links to round out the year:

Continue Reading

Competing for nonprofit dollars

On 22 Dec, 2006 By mpm With 1 Comments

<rant> Many of you know that I have a real desire to ease nonprofit pain in two particular areas: vertical apps and data integration. This simply comes from my years of working with nonprofits who are struggling with their data issues and need good solutions to them.

I just finished reading Allen's recent posts about the new wave of widgetized donation functionality that some big (and not so big) players in the nonprofit technology web services space are pushing out. Yes, it's a good thing that there are lots of competitors in the field of CRM/fundraising in general, and a lot of them are doing some really interesting, cutting-edge stuff, which is great.

What ticks me off is that, by far, the richest (and I mean that in many possible senses) area of software development in the nonprofit sector is ... fundraising. I understand how important fundraising is (especially now, as the coordinator of an organization that needs money), but why aren't there 5 big companies jockeying for space to provide nonprofits with reasonably priced, say, client management packages? Or any of the thirty-five other mission-critical tasks that nonprofits need to do to make the world a better place?

I know, I know: fundraising is one of the functions that almost all nonprofits share, and it is where the money is, and software developers have to make a living (er, well, Kintera is making more than a living - they are maximizing shareholder profit). But if just a fraction of the time, energy, and money spent on building CRM and fundraising software/services (how many gazillion of them are there?) went into the other software and data needs of organizations, I daresay they might not be in as much of a pickle in terms of making choices about vertical apps. </rant>

Technorati Tags: fundraising, nptech, web2.0

Continue Reading

Social Networking

On 21 Dec, 2006 By mpm With 1 Comments

This week, I think I learned something about the social networking aspects of Web 2.0. It came from two different sources: Ma.gnolia and LinkedIn. They have very different purposes - I happen to use Ma.gnolia primarily to store my own bookmarks for easy access, and only secondarily to share them. Lately, I've made a couple of interesting connections with people because of my links.

LinkedIn is something that I resisted using for years, until it became clear that a lot of people in the nonprofit technology field are on LinkedIn, so it made sense to join that particular bandwagon. We'll see how far it goes, but it has been fun reconnecting with people, and looking into who knows whom.

So I'm learning, and experimenting. I'm not sure I'm convinced of it all yet, but it's interesting. And you can look me up at my Ma.gnolia bookmarks and my LinkedIn profile.

Technorati Tags: bookmarking, socialnetworking

Continue Reading

What I'm up to

On 18 Dec, 2006 By mpm

It's the end of the year, and it seems a good idea to post about what's on my agenda for the next year, and what kinds of things I'll be working on, thinking about, and writing about in 2007.

First off, for those of you who don't know: I am the new coordinator of NOSI - the Nonprofit Open Source Initiative. I'll be working half-time for NOSI, to start the ball rolling on some interesting projects, both on its own and in collaboration with other organizations in the sector (including NTEN and Aspiration, among others), and to raise money to help make NOSI a sustainable organization going forward.

I'm also still doing technology implementation work with Database Designs. That work will mostly be maintaining and improving code that I wrote before I went on "sabbatical," and, increasingly, work at a meta level - project management, training, and the like. I'll likely have more info on that stuff soonish.

I am also very interested in doing a lot more writing - the API whitepaper for NTEN should be out soon, and there are other things I've got up my sleeve in terms of more concrete writing, some for NOSI, some just for the greater good. I will be focusing on open source tools for the most part.

I think it will be an interesting, and exciting year.

I will be blogging a bit more this week, but then I'll be offline for the Christmas week, doing other kinds of writing.

Continue Reading

Web 2.0 is getting beat up a bit (rightly so...)

On 18 Dec, 2006 By mpm With 3 Comments

Allen, one of my favorite bloggers (whom I only recently started to read, which is my loss), has a great curmudgeonly post on Web 2.0. (I consider Allen a fellow neo-luddite, whether or not he agrees with that characterization.) He then follows it up with a pointer to an interesting post on the power consumption of Second Life avatars, which should absolutely give everyone pause. I didn't even think about that aspect of it when I wrote my own curmudgeonly post about SL, quite a while ago now.

Holly seems to agree with him, although she's more of a cheerleader for Web 2.0 than Allen is, for sure.

What I'd like to do is unpack Web 2.0 and give nonprofits pointers and resources around the specific Web 2.0 tools that will actually matter to them - which, to my mind, means open APIs and RSS, basically, plus maybe some collaboration tools like wikis, or blogging for organizations for whom standing on a soapbox is an important mission-connected activity - and stop holding it up as a package that is not, as Holly says, going to be the sector's savior.

Let\'s give nonprofits the tools they are really going to use to make their lives easier, and serve more people. We geeks get to play in the sandboxes of Web 2.0, or whatever is coming down the pike. And that is certainly fun.

Technorati Tags: nptech, web2.0

Continue Reading

Things that make me feel better

On 14 Dec, 2006 By mpm With 2 Comments

As someone who has developed web database applications for clients, I always hate it when they get errors. Things like this make me feel so much better. Even the big guys, with big budgets, mess up sometimes... It also means at least these guys are running Windows.

[Image: screenshot of a website error message]

Continue Reading

OSS User communities

On 12 Dec, 2006 By mpm

One of the things that can make (or break) an open source tool is the community around it. Just as you would evaluate a company that releases a specific application you are interested in, understanding and evaluating the community around an open source project can be quite important.

Seth Gottlieb (a fellow Western Mass person) has a great post on his blog about how to go about looking at the communities around open source projects. It's definitely worth a read.

Technorati Tags: community, opensource

Continue Reading

Open Standards Part 1: Introduction

On 08 Dec, 2006 By mpm

So first, a note. I'm again indulging in the horrible practice of overlapping series. I know that I haven't finished my series on The Wealth of Networks - I hit a snag in my reading: my last papers to write for seminary, and transitions. So, once the papers are done and things settle back down, I'll plunge back into Benkler and keep going.

In the meantime, something that's been on my mind for years is the concept of Open Standards, and their potential value in the nonprofit sector. I think it's a really good topic for a series: it's meaty, there's lots to talk about, and there is some news in that arena, around Microsoft's Open XML standard (which was just approved by a standards body) and the Open Document Format, supported by Open Office and others. I'll talk more about that in part II.

So first, what is an open standard? Wikipedia defines it best:

Open Standards are publicly available and implementable standards. By allowing anyone to obtain and implement the standard, they can increase compatibility between various hardware and software components, since anyone with the necessary technical know-how and resources can build products that work together with those of the other vendors that base their designs on the standard (although patent holders may impose "reasonable and non-discriminatory" royalty fees and other licensing terms on implementers of the standard).

So what this means is that if a standard is open, it's documented, and anyone can use it to create things. A great example of a standard is HTML. Any web browser anyone puts together can render HTML, anyone can write a file in HTML, anyone can write an HTML editor, and anyone can move that HTML from program to program. You can write an HTML document in Dreamweaver, then open it up to edit in Mozilla, then open it up to edit in a text editor, then ...

An open standard (in the software realm) gives developers the freedom to develop applications that use that standard, and users the freedom to take their data wherever they want, or move their data from one application to another freely, because the applications speak the same language.
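Because the standard is openly documented, nothing stops any program in any language from consuming HTML. As a small illustration (the snippet and URLs below are made up), here is a sketch using the HTML parser that ships in Python's standard library to pull the links out of a page:

```python
from html.parser import HTMLParser

# Since HTML is an open standard, a parser can be written in any
# language; Python happens to ship one in its standard library.
class LinkCollector(HTMLParser):
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        # Collect the href attribute of every anchor tag we see.
        if tag == "a":
            self.links.extend(v for k, v in attrs if k == "href")

page = '<p>See <a href="http://nosi.net">NOSI</a> and <a href="http://nten.org">NTEN</a>.</p>'
collector = LinkCollector()
collector.feed(page)
print(collector.links)  # → ['http://nosi.net', 'http://nten.org']
```

The same page could just as easily be parsed by a browser, an editor, or a search engine crawler - that portability is exactly what an open standard buys you.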

So what's on my plate for this series? In the next post, I'll talk about the document format war. After that, I'll talk about identity standards (like XDI). Then I'll talk about microformats, and I'll wrap up by talking about some possibilities for nonprofit-focused open standards (like the seemingly deceased OPX).

Technorati Tags: microsoft, nptech, openstandards, standards

Continue Reading

Wiki here, wiki there, wiki everywhere

On 08 Dec, 2006 By mpm With 1 Comments

I've become somewhat of a wiki fanatic. Well, maybe not that far - but I love wikis. I know I complained recently about the lack of Web 2.0 (including wiki) interoperability. But I have been using wikis a lot these days, and I'll fill you in a little as to why I think they are grand.

I started using wikis about 4 years ago, when I went to a conference and there was a conference wiki for people to collaboratively take notes, add their bios and information about their projects, etc. That was the beginning of my love affair. In fact, I loved wikis so much, I wrote a wiki module for the now pretty much defunct open source CMS I'd written. I've contributed to Wikipedia, and probably to about a half-dozen other public wiki projects. But what I want to talk about here is how I use wikis every day.

First, I use a wiki pretty much every day to keep track of some of the things I need to do. The company I do my technology implementation work with has a wiki for technical documentation, and other things the group needs to keep track of. I keep my to-do list for them on that wiki, so that not only can I edit it, but others can edit it, add to it, take things off, help me prioritize, etc. (Yes, I will still complain that it doesn't integrate with my other to-do lists.) Also, keeping technical documentation on a wiki is so helpful, because it is easily editable by multiple people, and we can see the history of the edits.

The second way I use a wiki very often is in the workings of the organization I am now coordinator of, NOSI (the Nonprofit Open Source Initiative). We have a wiki used just by the steering committee, with agendas for conference calls, notes from meetings, ideas, projects, etc. It's very helpful to have all of that information in one place, editable by the whole steering committee.

I've decided that I like wikis even better than Google Docs for sharing content with people. Google Docs has its advantages, and it's nice that you can generate a well-formatted document when you are done; but if that doesn't matter, it's hard to beat a wiki in terms of ease of collaborative editing. Wikis are light, simple to learn, and pretty easy to use. And they don't need AJAX to make them work well.

My favorite wiki software, I think, is MediaWiki, which is the wiki that Wikipedia runs on. My second favorite is PurpleWiki, by Blue Oxen. It's got some very interesting features, and I've enjoyed working with it. I'll likely install it on my own server to play with at some point.

Technorati Tags: nptech, wiki, web2.0

Continue Reading

Eating my words

On 30 Nov, 2006 By mpm With 1 Comments

I don't really think of myself as a pundit, probably because I am very willing to admit that I am sometimes wrong. Sometimes I don't have enough information, and I come to somewhat erroneous conclusions because of it. So, in that spirit, I'm eating a bit of, as Katrin called it, humble pie.

A while back, I had a post called "Metaphors" where I lamented the fact that there had been a large movement toward using Salesforce. I questioned the use of business metaphors in nonprofit organizations (which is still something I find problematic), and questioned the use of sales metaphors in software that nonprofits use.

Well, I didn't have enough information. Because I'm writing a whitepaper on APIs for NTEN, I had a great conversation with Steve Wright of the Salesforce Foundation, the part of Salesforce that is giving away its services to nonprofit organizations. I learned a lot about what they are doing, and why, and I think I've realized that I jumped to the wrong conclusion about the use of Salesforce in nonprofit organizations. It sounds like they have some pretty interesting ideas about building horizontal platforms that, in the end, might benefit the sector more than they detract from it, especially given the kind of resources they have available.

So: live, work, talk to people, and learn more. There are some interesting things brewing in my head about open source, and the ways the open source ethos and mentality are spreading faster and broader than the actual thing itself. But that's another post. This is the humble pie post.

Technorati Tags: nptech, openapis, opensource, salesforce

Continue Reading

My wish for Web 2.5

On 28 Nov, 2006 By mpm With 5 Comments

Well, in the process both of learning about all the very cool Web 2.0 apps out there and of beginning to try to use them to create content and organize my life, I have come to the following conclusion: the apps are great, but integration still sucks.

First, there's the blogging issue. I keep 2 blogs of my own, contribute actively to one community blog (at nosi.net), and could potentially contribute to quite a number of others. I don't get paid to blog, so I don't really want to spend all my time doing that. And I also don't want to do too much cross-posting of content. But the community blogs do provide a way for a wider audience to read the content I have created. Unfortunately, the nptech world hasn't yet caught on to the "Planet" phenomenon of the open source world (see Planet Ubuntu Women). These are sites that simply aggregate the blogs of those involved in a particular open source project (like, in this case, women involved in Ubuntu). It's a great idea, I think. I'm aggregated on live.linuxchix.org, a planet for those who are involved in LinuxChix. It would be great to have a few nptech-focused "planets" out there. I think those are better than community blogs - and it's so easy for people to get involved. (Hint to NTEN: Affinity Group Planets!)
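For the curious: a planet is really just a feed aggregator - fetch each member's RSS feed, merge the items, and sort by date. A minimal sketch in Python (the two toy feeds are inlined as strings; a real planet would fetch each blog's feed URL and render HTML):

```python
import xml.etree.ElementTree as ET
from email.utils import parsedate_to_datetime

# Two toy RSS feeds, inlined for the example; a real planet would
# fetch each contributor's feed over HTTP instead.
FEEDS = [
    """<rss><channel><title>Blog A</title>
    <item><title>Post 1</title><pubDate>Mon, 04 Dec 2006 10:00:00 GMT</pubDate></item>
    </channel></rss>""",
    """<rss><channel><title>Blog B</title>
    <item><title>Post 2</title><pubDate>Tue, 05 Dec 2006 09:00:00 GMT</pubDate></item>
    </channel></rss>""",
]

def planet(feeds):
    """Merge the items of several RSS feeds, newest first."""
    items = []
    for xml_text in feeds:
        channel = ET.fromstring(xml_text).find("channel")
        source = channel.findtext("title")
        for item in channel.iter("item"):
            when = parsedate_to_datetime(item.findtext("pubDate"))
            items.append((when, source, item.findtext("title")))
    return sorted(items, reverse=True)

for when, source, title in planet(FEEDS):
    print(source, "-", title)
# → Blog B - Post 2
#   Blog A - Post 1
```

That merged, date-sorted list is the whole trick; everything else about a planet site is presentation.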

Second is bookmarking. I now have accounts at del.icio.us, ma.gnolia, Furl, and StumbleUpon. Each has its strengths and weaknesses, and its own reasons I use it (I use Furl, for instance, to save pages I think might go away, or become paid content after a short time being free). But I'm really reaching the end of my patience with double and triple bookmarking. It's just so ... painful.

And then there are to-do lists. Right now, I am a pretty multi-faceted person (well, I always was, but right now I am very much so in practical terms). I am working on several projects, both collective and personal, and I like and use to-do lists. But some of my to-do lists I need to share, and others I don't. And, of course, not everyone I work with chooses the same to-do list or project manager. Even if they did, the tools might not integrate well (for example, I use three different Basecamp accounts, which don't integrate with each other). Wikis are another great collective to-do list/project management tool, but they don't integrate, either.

Luckily, it seems that most of the project-manager-type apps in Web 2.0 land either use iCal or integrate with Google Calendar (my present calendaring software). So that's good. But even that integration can be clunky, and it's one-way: I can't change a Basecamp milestone in Google Calendar; I have to go back to that particular Basecamp account to change it. Sigh.

It's a mess out there. Is anyone going to help clean up? I'm getting tired of wasting time in front of my computer. I'm ready to save time so I can go out in the sun.

Technorati Tags: linuxchix, nosi, nptech, nten, opensource, web2.0


Eben Moglen on Software

On 26 Nov, 2006 By mpm

Watch this video. It's interesting, and should make us think a lot about why to use open source software.

Thanks to Jon Stahl for the heads up.

Technorati Tags: economics, intellectualproperty, nptech, opensource


Ubuntu open week

On 26 Nov, 2006 By mpm

Next week is Ubuntu Open Week, a series of events and classes about Ubuntu Linux, for people interested in getting involved in Ubuntu. The events are all on IRC (Freenode). I'll be sitting in on a few, I'm sure, mostly for curiosity's sake. Ubuntu seems to be becoming the Linux distro of choice for a lot of people, and so far, it's my favorite. Using Red Hat, as I sometimes do, feels like doing battle in comparison. Ubuntu took the best that Debian had to offer, and left the weaknesses behind, I think.

Anyway, I think it's worth checking out.

Technorati Tags: linux, nptech, opensource


Preferred nptech instant messaging protocol?

On 20 Nov, 2006 By mpm With 5 Comments

I have, for a while, maintained accounts on just about all of the IM protocols out there (AIM, ICQ, Jabber/GTalk, MSN, Yahoo), mostly because there are some people I know on one of the less-used ones (like Yahoo or MSN). And most of my work colleagues use AIM or ICQ, if they are on IM at all. Lately, however, I've been noticing that a lot of nptech folks use Skype, possibly exclusively. I like Skype, and the ability to actually talk with people is really useful (I have both SkypeOut and SkypeIn as well).

So, is this a trend? What are people's opinions on using Skype vs. other IMs? Do you mostly use Skype to IM, or to actually talk? Inquiring minds want to know.

Technorati Tags: instantmessaging, nptech


Open Source News

On 19 Nov, 2006 By mpm

Here are some tidbits from the open source world that might be of interest...

  • Sun makes Java open source. This is a big one. A few components (like the javac compiler, and others) have been open-sourced under the GPL, with the rest of the JDK to follow next year. Find details at the OpenJDK Project.
  • Make has a kit for an open source mp3 player. Yes, open source hardware. Cool!
  • This is old news, but I'm finally getting to understand it. Some 800-pound gorillas (Microsoft and Oracle) are bullying their way into the open source sandbox. The Oracle issue is more straightforward: Oracle unveiled "Unbreakable Linux," providing support for Linux at prices that severely undercut Red Hat's. There are some interesting theories afloat about this one (a ploy to set up a hostile takeover of Red Hat?). The second is the deal between Microsoft and Novell. Basically, they have agreed to collaborate on technologies and support. Here's the kicker: Novell is essentially paying Microsoft protection money, and in return Microsoft agrees to give Novell customers indemnity against any patent or IP challenges. Eben Moglen thinks this deal will be dead in the water because of the GPL 3.0. I'm not so sure, since no software project has to adopt version 3.0. It does mean there will be a lot to watch in the next year or so.
  • After you camp out and fend off the violent hordes to get your Sony PlayStation 3, you can boot Linux on it.
  • Watch this documentary on Net Neutrality.

Technorati Tags: java, oracle, netneutrality, linux, puppy, microsoft, mp3, opensource


The Wealth of Networks, Chapter 3

On 16 Nov, 2006 By mpm With 2 Comments

I bet you thought I'd stopped reading? Or given up? Nah. It gets chewy, for sure, but it feels like every chew is worth it. I'm reading this book at the same time as I've been working on the Nonprofit Open Source Initiative, and I'm realizing that all of the justifications for why I am so into open source and free software are right here in this book! So here's the summary for Chapter 3.

Chapter 3 is a discussion of peer production - how it is that people have come together to collaboratively create software and content; basically, knowledge production. A salient quote:

... the networked environment makes possible a new modality of organizing production: radically decentralized, collaborative, and nonproprietary; based on sharing resources and outputs among widely distributed, loosely connected individuals who cooperate with each other without relying on either market signals or managerial commands. This is what I call "commons-based peer production."

He talks about three examples which have become classic: free/open source software, SETI@Home, and Wikipedia. He spends a fair bit of time talking about the Wikipedia model and, basically, how amazing it is.

The important point is that Wikipedia requires not only mechanical cooperation among people, but a commitment to a particular style of writing and describing concepts that is far from intuitive or natural to people.

He then spends some time making clear how the new networked environment makes peer distribution possible. Napster and its follow-ons are a prime example:

What is truly unique about peer-to-peer networks as a signal of what is to come is the fact that with ridiculously low financial investment, a few teenagers and twenty-something-year-olds were able to write software and protocols that allowed tens of millions of computer users around the world to cooperate in producing the most efficient and robust file storage and retrieval system in the world.

He then talks about something I find really interesting, and hadn't fully understood until I read it: why the radio spectrum was regulated in the first place, and why regulation is now basically moot. It's really worth a read.

In the next chapter, he will talk about the economics of social production, and the motivations behind peer content creation.

Technorati Tags: media, opensource, economics, peer2peer


Women and Technology

On 15 Nov, 2006 By mpm With 3 Comments

One of the things I really like about the nonprofit technology community is that there are so many women involved. There are lots of women on the varied lists I read, there are nonprofit technology organizations with lots of women leaders, and all of that is great. But then there is the little secret (well, it's not so secret): when you look at systems administrators, or coders, or net-heads ... the women kind of vanish. When it comes to conversations about the innards of APIs (REST or SOAP?), why Ruby on Rails rocks (or doesn't), what a good alternative port for SSH is, or when we're going to implement IPv6 ... there's a whole lot of testosterone, and not a lot of estrogen, hanging about. So where did the women go?

As someone who was a real rarity in my early years (how many African American women neuroscientists have you heard of?), I didn't ask this question too often (it would just depress me). But as I re-enter this field I love, I can't help but think about it again.

This is why, by the way, I love hanging out with Linuxchix. The community has been around for a while, and it's full of women who know their way around a Linux kernel (some of them even get paid to hack it) and can answer just about any question on Apache mod_rewrite I can come up with. There are some really great men who hang out too, who don't mind being around a bunch of geeky women.

So maybe we can get some Linuxchix involved in the nptech community, and get some nptech women who might be a little shy about getting their toes wet with technology installing Linux and writing code, with Linuxchix support, and have some nice synergy.

Technorati Tags: linux, linuxchix, nptech


This is brilliant

On 09 Nov, 2006 By mpm With 3 Comments

This is great. It's the announcement for NTEN's video/mashup contest. You gotta watch the video!

Technorati Tags: nptech, nten, 07NTC


APIs - what, how, whither, and writing

On 09 Nov, 2006 By mpm

I've been asked by NTEN to write a whitepaper on APIs, following their Open API debate. I've been learning about some interesting examples of the use of APIs in nonprofit organizations, as well as what vendors (both proprietary and open source) are thinking about the issue. I'm looking forward to getting into the meat of the writing. It's funny: I write a lot here on my blog, and I forget how much I enjoy technical writing (or semi-technical writing, in this case).

In the process of reinvigorating the NOSI (Nonprofit Open Source Initiative) website, I took a look at the primer on open source software I wrote eons ago in Web time. It needs some updating, but it's actually still pretty relevant. That's a good thing.

If you have any API wisdom - examples, strategies, what have you - that you'd like me to hear about, please drop me a line.

Technorati Tags: nosi, nptech, openapis, nten


My Gmail experiment is over

On 02 Nov, 2006 By mpm With 5 Comments

I like Gmail, like a lot of people do. I decided to give Gmail a whole month of my total attention: I forwarded all of my mail to it, and used only Gmail for about a month. I dug all of that AJAX goodness. And now, I'm going back to Mac OS X's Mail.app.

Mail.app has its drawbacks, but the filing mechanism I'm used to (nested folders) works so much better for my brain and my work than the label system Gmail uses. And, OK, I'll admit it: I'm not liking some of the ads that have shown up next to emails from my SO, or my friends. Who wants advertisements for cough suppressants, or sleep aids, or ... well, you can guess.

Mac OS X Leopard is supposed to include a major upgrade to Mail.app, so perhaps some of my issues with it will be dealt with. Who knows.


Carnival of Nonprofit Consultants

On 30 Oct, 2006 By mpm With 4 Comments

This Carnival of Nonprofit Consultants has given me a chance to read some blogs I don't usually get to read, since I'm so often focused on the tech field. All of these articles were interesting and thoughtful. It's really nice to learn more about what people are talking about.

And in realms more familiar:


The Wealth of Networks, Part II

On 29 Oct, 2006 By mpm

Chapter 2: Some Basic Economics of Information Production and Innovation

This is a really interesting chapter, where he lays out the basic economic theory behind information production. He starts by asking what the most efficient way of producing information is, in the sense of the greater good. Basically, the most efficient arrangement for society's welfare is for everyone to give information away for the cost of distribution only. The standard argument for exclusive rights, he says, is that they encourage information production and innovation:

"In order to harness the efforts of individuals and firms that want to make money, we are willing to trade off some static inefficiencies to achieve dynamic efficiency. That is, we are willing to have some inefficient lack of access to information every day, in exchange for getting more people involved in information production over time."

This is, in fact, a critical issue. He further says:

"If information producers do not need to capture the economic benefits of their particular information outputs, or if some businesses can capture the economic value of their information production by means other than exclusive control ... the justification for regulating access by granting copyrights is weakened."

He goes on, in a variety of ways, to show that both of these things are true.

He talks about the quirks of information production, and the concept of rival vs. nonrival goods. A rival good is something that, if you have it, I can't; if I want one, someone has to work to get or make it. Food is rival. Cars are rival. A nonrival good is something both of us can have at the same time, without any additional labor or resources. Electronic information is nonrival: its marginal cost (the cost after initial production) is basically zero. Because of this, copyright might well have negative, not positive, net benefit. In fact, he shows that the data indicate a decrease in information production with increasing patent protection. This is because the cost of producing more information (which is, of course, based on previous information production) increases with patent and copyright protection.

He goes into a very interesting discussion of the matrix of strategies for information production: rights-based, nonexclusive-market, and nonexclusive-nonmarket types of production. He then discusses the strategies of each, like the Romantic Maximizer film director who sells work to a "Mickey" like Disney, or the 'Jane' (he uses Joe, but I'm taking liberties) Einstein sitting in her basement coding, releasing her software via copyleft to a limited sharing network.

He then looks at how much of each type's revenue actually depends on copyright protection (not a whole lot), and says:

"The difference that the digitally networked environment makes is its capacity to increase the efficacy, and therefore the importance, of many more, and more diverse, nonmarket producers falling within the general category of [Jane] Einstein. It makes nonmarket strategies - from individual hobbyists to formal, well-funded nonprofits - vastly more effective than they could be in the mass-media environment."

What I took home from this chapter are two things: 1) in this networked environment, copyright and patent protection are, in effect, counter to the greater good of society (I knew that already, but it's nice to have economic arguments to help); and 2) there is a lot of potential to be harnessed from people who are doing things for a wide variety of reasons. Stay tuned for Chapter 3.


Carnival of Nonprofit Consultants coming here!

On 27 Oct, 2006 By mpm

Next week, the Carnival of Nonprofit Consultants is hosted here! I'm behind - I was supposed to tell you about this a few days ago! (I have a sort of good excuse.)

I don't have a theme - please send me your best stuff! You can email your posts to me. Because I'm delayed, you have through Sunday evening; I'll be posting the Carnival on Monday afternoon.


New to You Laptops: the series

On 20 Oct, 2006 By mpm

On my regular blog, for reasons that are mostly historical, there is some tag-team blogging going on between me and a fellow religious blogger by the name of Scott Wells. The issue: using used laptops running Linux (specifically Ubuntu) for cash-strapped churches or nonprofits. I realized the series would be of interest here. So:

  • First, Scott starts off with the main question: what kind of laptop would be good for running Ubuntu?
  • I follow up with an answer sort of from the horse's mouth, and ask a follow-up question about how to get people to use Linux.
  • He answers my question quite deftly, and asks some more questions.
  • And I follow up with some details about why Linux is so great for used computers.

We\'ll see how far this goes...


Web 2.0 Part Vb: APIs

On 20 Oct, 2006 By mpm With 2 Comments

This morning, I sat in on the "Great Open API Debate" hosted by NTEN. First off, a tip o' the hat to NTEN for organizing it, to the participants of the panel for an interesting conversation, and to Mark Bolgiano of the Council on Foundations for awesome moderation.

There were four perspectives:

If I were going to complain, I'd say it was way too for-profit-vendor-heavy (63%?). It would have been nice to hear from a circuit rider or "for-little-profit" integrator/consultant type, and maybe another nonprofit type (a moderately tech-savvy ED?).

There was also a backchannel chat room, which was really useful and interesting. (I hope NTEN can post the transcript.)

I'm not going to go through everything they said - I expect a recording will be available on the NTEN blog or site soon. But I'm going to highlight what interested me most, and talk a little about the zen of APIs.

So, first off, one interesting thing was that there were some initial differences of opinion about how to define open APIs, and what they are used for. Two different kinds of APIs were discussed: ones that help organizations with interoperability within their own systems - getting data from one app to another - and APIs for things like Google Maps mashups. Basically, what made an API "open" was that it was free to use and well documented. It seemed that only Blackbaud had APIs you have to pay for; the other vendors either supply their paying customers with APIs or, in the case of CivicSpace, the APIs are, well, free and open, like everything else about open source. I have to admit that I went into this thinking much more about the public types of APIs, even though APIs for internal data integration have been a real interest of mine, and a real concern as well.
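The interoperability side of that definition - getting data from one app to another - often comes down to one application exposing its records in a documented, machine-readable format, and another translating them into its own shape. Here is a minimal sketch of that hand-off in Python, using JSON; the field names and record shapes are invented for illustration, not taken from any actual vendor's API:

```python
import json

# What a donor database might export through its API (invented fields).
exported = json.dumps([
    {"donor_id": 17, "name": "A. Donor", "total_gifts": 250.0},
    {"donor_id": 18, "name": "B. Giver", "total_gifts": 75.0},
])

def import_donors(api_payload):
    """Translate the exporting app's records into the importing app's shape.

    This mapping step is where 'interoperability' actually happens:
    each side keeps its own schema, and the API payload is the bridge.
    """
    return {
        rec["donor_id"]: {
            "display_name": rec["name"],
            "lifetime_value": rec["total_gifts"],
        }
        for rec in json.loads(api_payload)
    }

print(import_donors(exported))
```

A documented API means the importing side can write this translation once and trust it; an undocumented export format means reverse-engineering it, which is the pain the debate was about.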

Zach brought up an interesting point worth highlighting. From his perspective (and mine, too), one of the big issues (as he put it, the elephant in the room) is how open APIs impact the business model of vendors. It's no secret that some of the vendors' competition comes from open source software, and perhaps that's a good thing - it provides them with an incentive to match features, like open APIs. One vendor said that very few nonprofits have the resources to take advantage of APIs. Someone (I think it was Nick or Peter) countered that nonprofits can do more than vendors think they can - so providing the resources is important.

Another key thread was the sense that, in some way, openness and security are at odds with each other. As someone in the backchannel, and a couple of people in the debate, said: security matters the minute you open one port to the world, and you can have both openness and security. I did feel that the vendors had a small tendency to raise security concerns when asked about openness. That bothered me a bit - they are totally different issues.

So I got a couple of key points out of this. This conversation would not have happened a couple of years ago, and that it is happening now is great (late, but great). The sores on my head have only just recently healed (it must have been seminary) from helping a couple of organizations, a few years ago, plan to integrate some of their internal databases with a big vendor database that-shall-not-be-named. So I guess I should be happy that there were so many for-profit vendors at the table, and that this means it will be easier for nonprofits to do the kinds of integration between applications that were nigh impossible a couple of years ago. That's good news.

APIs are here to stay. In fact, they are the future. The standards and technology have matured enough to provide the potential for real richness in data integration: inside organizations, between organizations, and with bigger, broader entities such as Google. The more nonprofits understand APIs as useful to them, and demand them from the vendors that provide (or build) their software, the better, as far as I'm concerned.


On 19 Oct, 2006 By mpm With 2 Comments

  • For you Windows types: IE 7 is out, and a vulnerability was found 24 hours after release.
  • Also, for you Windows types, here is a plain-English interpretation of the Windows Vista EULA (End User License Agreement). How about this one: "You may not work around any technical limitations in the software." What else is it that we geeks do? You can't play MPEG-4 videos except under extremely limited conditions, and if you upgrade your computer more than once, you'll have to pay. So if you are a hardware geek, expect to pay MS every other time you get a new motherboard. And since they seem to upgrade their OS every 6 years or so ... I have a suggestion: get Ubuntu, and be done with it. Notice I didn't say "get a Mac." You could, and you'd still be better off - Apple's OS is, of course, proprietary, but Apple's EULA is a little less evil. And you can run any Windows software you want on it, either at native speed or a bit slower in emulation - but you'd still have to buy Windows, so you'd still have the same problem. If you really want to be done with stuff like this, get Ubuntu. It's the best flavor of Linux out there right now in terms of ease of use. (Some think Microsoft is abandoning power users.)
  • A company going in the other direction: Eudora is going open source (no, they are not open-sourcing the old Eudora code; they are changing direction to use Mozilla Thunderbird as the underlying technology).
  • There is a new site called "Campus Reader" that aggregates feeds from college news sources. I like it. A lot. Anyone for a "Nonprofit Reader"?
  • Yahoo and Microsoft have Google Envy. Is this news?


Web 2.0 Part Va: APIs

On 17 Oct, 2006 By mpm

One of the best parts of Web 2.0 for geeks is APIs. These are Application Programming Interfaces, and they are a relatively new part of the way Web 2.0 works. Like the freedom RSS gives end users - getting the data you want into your hands, to read when and how you want - APIs give programmers (and, at times, end users) the freedom to get data from Web 2.0 services like del.icio.us, Google, Flickr, and many, many others, and to use and manipulate that data to their own ends.

Among the best examples of the use of APIs are Google Maps mashups: sites that take data in their own databases, grab maps from Google Maps, and put the two together inside their own application. Other examples include desktop applications that upload photos to Flickr.
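The mechanics of a mashup like that are usually just: take a record from your own database, and build a request against the map provider's documented API with it. Here's a sketch of that pattern in Python's standard library; the endpoint and parameter names are placeholders for illustration, not the real Google Maps API:

```python
from urllib.parse import urlencode

# An invented record from a nonprofit's own database.
shelter = {"name": "Main St. Shelter", "address": "123 Main St, Springfield"}

def map_request_url(record, api_base="https://maps.example.com/geocode"):
    """Build a request URL for a (hypothetical) mapping API from one record.

    The API base and parameter names here are made up; a real mashup would
    substitute the provider's documented endpoint and parameters.
    """
    query = urlencode({"q": record["address"], "format": "json"})
    return f"{api_base}?{query}"

print(map_request_url(shelter))
# A mashup would fetch that URL, read back coordinates from the response,
# and plot the point on an embedded map alongside its own data.
```

The point is that the organization's data never leaves its control; the API just lets two datasets meet in the browser.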

Folks at NTEN have been thinking about APIs, and will host a discussion about them on Friday. I intend to be there, listen in, take notes, and post my opinions about the zen of APIs for nonprofits.

But I have a first take. I think APIs are an expression of the best of what the internet is about: the free sharing of information in ways that allow for new innovation. They give nonprofits free access to data they would normally not have access to (like mapping data), or would have to pay a lot for. And if nonprofits, in making their own Web 2.0 applications, provided their data via open APIs, it would help other nonprofits, and the sector as a whole. I have dreams of applications that combine, say, available bed space in shelters and soup kitchens, all mapped for people to find - or any other interesting combination of things. It's all possible if people freely share, through an API, the data they already share on the web.


The Wealth of Networks, Part I

On 10 Oct, 2006 By mpm With 1 Comments

I kept hearing about this book. My friend Katrin over at NTEN told me about it first, and then it kept popping up all over. The book is "The Wealth of Networks," by Yochai Benkler, a professor at Yale Law School. It's available at that link in a multitude of forms; I have it in nice, wonderful book form. I like portable that way.

Anyway, I should apologize in advance for overlapping series (or not, I guess). I'm not finished with the Web 2.0 series, but I really wanted to delve into the meat of this book, and blog about it. I think I'm liking the book so much because it's an amazing combination of some of my favorite things: technology, law, and economics. (No, sadly, no theology here, but I could probably find a way to weave it in.)

I'll start with Chapter 1, which introduces the basic ideas of the book, and the importance of this particular moment. He lays out the beginnings of his argument: that information and cultural production are central to human freedom and development, and that this new "networked information economy" provides a disruptive moment in time; with social action, we can use this new kind of economy to further human freedom, even as other forces try to create systems that will limit it.

He lays out some interesting concepts - things I'd been aware of, but not really studied enough to articulate. He talks about how the motivations for information and cultural production are very often nonprofit and nonproprietary, and how, as the cost of information production goes down, those motivations take the fore - they become more important. He talks about the ways a networked information economy increases autonomy for individuals, and he deftly answers the critiques of the democratization of information that the networked information economy provides. Then he lays out the resistance of the actors he calls the "industrial information economy," who are working to limit this broadening effect on autonomy and freedom. He argues that we are going to have to work for this - it's not going to happen just because the technology provides the opportunities.

I'll probably be blogging chapter by chapter. The chapters are pretty dense, although I'm having a great time with Chapter 2 already - it's nice to see empirical evidence for things I've been thinking for a while.


Metaphors

On 10 Oct, 2006 By mpm With 4 Comments

I'm a fan of metaphors. Human beings use metaphors all the time to understand the world, and to frame it; metaphors are powerful in shaping the way we think about things. Think about how powerful the metaphors around the "war on terror" are, and how differently we'd think about our world and our lives if the prevailing metaphor were "catching criminals who use terrorist tactics."

Software uses metaphors all the time as well. And we all know how organizations end up reworking their procedures and ways of doing things when a large new software implementation happens. It's inevitable: it's impossible to shape software completely to the ways of doing things that already exist, so those ways end up being shaped by the software that is implemented. Doesn't the metaphor used by that software matter, then, in a nonprofit setting?

One of the big changes in the nptech space, besides the ways Web 2.0 has clearly changed the nptech provider community, is the increasing use of Salesforce as a nonprofit CRM tool. I know it was sort of in process when I was fading out of the field, but I wasn't much aware of it. When I came back, I was surprised to see how many consultants and technology providers had jumped on the Salesforce bandwagon.

Of course, the use of business metaphors, language, and procedures in the nonprofit sector is far from new, and it's a trend I've never liked much. And I know that many nonprofits use software designed for corporations and for-profit entities, such as QuickBooks (although I know there is the NonProfit Books variant). For the most part, this has been simple exigency: there just isn't the number and richness of nonprofit software options out there that exists in the for-profit sector (except, of course, in fundraising software, but that's where the $ is).

Part of my perspective on this blog is a perspective I take in life: the means are the ends. If we adopt, in whole or in part, the metaphors of the corporate world, whose fundamental goal is making a profit, and use them in the nonprofit sector, whose fundamental goal is making people's lives better, does that create a problem or a conflict? I don't know how many people have raised this issue, but I think it's one worth raising.


Web 2.0 Part IV: RSS

On 07 Oct, 2006 By mpm

RSS is, in my humble opinion, a core component of the grease that makes Web 2.0 move. Open APIs are the second core component, and they're next up, in Part V (I think, unless something else comes up). What is RSS? It stands for Really Simple Syndication (or Rich Site Summary, or RDF Site Summary, depending on one's point of view).

I have a number of things to say about RSS. First, I'll describe it: how it works, and how to use it. Then, I have a very bold proposal that almost no one will agree with, but that I think may, in fact, be worth at least talking about (patience, patience - don't run down to the end <grin>).

So, we are in this new milieu of fast-updating content - how do we keep track of it? I don't want to have to go manually through each of my bookmarked sites to figure out which ones have changed and have new content (although some people do this). I want to know, all at once, what's new, and what I should go read.

This is where RSS comes in. When new content is generated (whether it's a blog post, new photos on Flickr, new bookmarks in del.icio.us, new videos on YouTube, etc.), the site adds it to a feed (which can be either a static file or dynamically generated) that a "newsreader" or "aggregator" can read, telling you what is new (what you have not read). The feed is in XML - a not-so-new standard for self-describing information (it tells you what it's going to tell you, then it tells you). You get that new content by "subscribing" to it using some sort of RSS reader - your browser, a standalone desktop client, or a web service.
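To make that concrete, here is roughly what a minimal RSS 2.0 feed looks like, and how a reader can decide what is "new" by remembering which item identifiers it has already seen. The feed contents are invented for illustration:

```python
import xml.etree.ElementTree as ET

# An invented minimal RSS 2.0 feed, as a site might publish it.
FEED = """<rss version="2.0"><channel>
  <title>Example Nonprofit News</title>
  <item><guid>post-1</guid><title>Annual report released</title></item>
  <item><guid>post-2</guid><title>New volunteer program</title></item>
</channel></rss>"""

def unread(feed_xml, seen_guids):
    """Return titles of items whose <guid> the reader hasn't seen yet."""
    root = ET.fromstring(feed_xml)
    return [item.findtext("title")
            for item in root.iter("item")
            if item.findtext("guid") not in seen_guids]

# First visit: everything is new.
print(unread(FEED, set()))
# After reading post-1, only post-2 shows as unread.
print(unread(FEED, {"post-1"}))
```

That "remember what you've seen, show the rest" loop is essentially all an aggregator does, whether it's a desktop client or a web service.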

So, if you are new to using RSS, the first step is to choose a way of reading feeds. For a while, I used a Mac program called NewsFire. I tried a number of desktop clients, but in the end a web-based service worked best for me. I used Bloglines for a long time, but I am now sold on Netvibes. (Moving from reader to reader isn't too hard, thanks to a file format called OPML: you can export an OPML file from one client and import it, with all of your feeds, into a new reader.)
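OPML itself is just a small XML outline listing feed URLs, which is why moving between readers is so painless. Here's a sketch of reading one with Python's standard library; the subscription list is invented:

```python
import xml.etree.ElementTree as ET

# An invented OPML subscription list, like one exported from a feed reader.
OPML = """<opml version="1.1"><head><title>My feeds</title></head>
<body>
  <outline text="Zen of NPTech" type="rss" xmlUrl="http://example.org/zen/rss.xml"/>
  <outline text="NTEN blog" type="rss" xmlUrl="http://example.org/nten/rss.xml"/>
</body></opml>"""

def feed_urls(opml_xml):
    """Collect every feed URL from an OPML subscription list."""
    root = ET.fromstring(opml_xml)
    return [o.get("xmlUrl")
            for o in root.iter("outline")
            if o.get("type") == "rss"]

print(feed_urls(OPML))
```

An importing reader just walks the outline and subscribes to each `xmlUrl`, which is why the format became the lingua franca for switching readers.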

One of the great things about RSS is that, well, lots of things can be feeds! I am subscribed to people's Flickr photostreams and to the nptech tag at Technorati (which is, of course, itself an aggregation of feeds), as well as to many blogs and news sources. Most newspapers and magazines have one or more RSS feeds to subscribe to. Many sites let you create dynamic feeds from searches, so you can see new content that fits your search criteria as it becomes available. Google News is a great example: you can do a Google News search on, say, "Peak Oil," and then get, in your aggregator, anything new that is posted with those terms. This is really useful if you are keeping track of certain kinds of news.

And there are new kinds of aggregations. Many blogs allow you to add feeds that end up being blog posts (like del.icio.us bookmarks, for instance.) And there is an interesting thing called "SuprGlu" which allows you to aggregate as many feeds as you'd like. (See my page for an example - it combines blogs, flickr, and del.icio.us. It's pretty interesting.) RSS and these kinds of tools create opportunities for communities to create aggregated feeds, for individuals to put their work together in one place, and so on.

RSS is a powerful tool. It is useful to you as an end-user, for gathering information and connecting yourself to sources of information and people. And it's a very useful tool for nonprofits to get their information out to their constituents.

So this is where my bold proposal comes in. It might seem too risky or bleeding-edge for organizations to do right now, but who knows. I would love it if every nonprofit that sends me an email newsletter, or a request for donations, or a news item about a new campaign or program, would just stick it into a feed instead of sending me an email. I don't really want any more emails. I want to go to my Netvibes page and see, right there, "oh, MoveOn has a new ad, and HRC is starting a new campaign and ..." It would save them money on bandwidth and expensive email newsletter services. I would get the message anyway, and I might even be more likely to respond (usually, I get so annoyed about getting yet another request for money in my inbox that it goes right in the trash.)

I know I'm not very representative of the general population - or even of the population of people who are donors. A lot of organizations (especially the big ones) do have RSS feeds on their sites - for blogs, or news, and such. But I've yet to hear about a concerted effort to move people from email newsletters to RSS - and that's what I'm looking for.

I think this is an example, from my perspective, of where newer might actually be better. RSS is an incredibly powerful tool, one that provides more opportunities for individuals and organizations to gather more information, and work together. Imagine what a site with the aggregated feeds of many of the organizations that are doing, say, human rights work would look like!

Continue Reading

Catching up part II

On 03 Oct, 2006 By mpm With 2 Comments

Catching up with the nptech field over the past few weeks has been a lot of fun. I've even been mentioned in a few places (including Beth's blog, and the blog I describe a bit below), which feels good. I'm glad people are liking what I'm saying.

I've added a bunch of new feeds to my Netvibes page (I've got a whole tab devoted to nptech blogs - if you are a Netvibes user, or want to try it out, click here to add my tab to your Netvibes page) and I'm enjoying reading the wide variety of things people are thinking and doing, and I'm looking forward to my minor (or major) forays back into the field. (If you want any details about what it is I'm doing with my life, and where it is taking me, check out my main blog for that info.)

I\'ve learned about a couple of good nonprofit-related blog virtual events: the Nonprofit Blog Exchange, and the Carnival of Nonprofit Consultants.

I'm involved in the Nonprofit Blog Exchange, and I drew one of my favorite new (for me) blogs: (East Coast) Michael Stein's Non-profit Technology Blog. It's one of my favorites because he and I seem to share some similar sensibilities about technology, but he covers a very different range of issues than I do - so I get to learn things. He has an interesting, good, and informative post on website accessibility - something that, unfortunately, I haven't taken as seriously in the past as I should have.

Anyway, I will be commenting on occasion about the nptech field as I see it now, and how dramatically it seems to have changed in just my year and a half or so away from it.

Continue Reading

IPv6

On 29 Sep, 2006 By mpm

So, this blog won't be totally technology zen. Sometimes, I'll talk about technologies I think are just cool, and useful, and, well, geeky, 'cause I can't help being a geek.

IPv6 is the next-generation Internet Protocol - that is, basically, the addressing system computers and routers and such use to direct traffic around a local network, and the internet. IPv4, the current version, was thought up in a world where PCs didn't really exist yet, and no one could even imagine that, well, you'd want to give your refrigerator an IP address. It allows for 4,294,967,296 addresses. Which, on one hand, seems like a lot, but it's not, when every cell phone, PC, router, cable set-top box and toaster has one.

So in comes IPv6, a different kind of addressing protocol. Its addresses are 128 bits long, which allows for 2^128 (about 3.4×10^38) of them - roughly 5×10^28, or 50 octillion, for every person on Earth. It will likely even manage to make it into space, I think.
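If you want to check those numbers yourself, a few lines of Python do the arithmetic (the `ipaddress` module is in the standard library; the per-person figure assumes roughly 6.5 billion people, the world population around 2006):

```python
# Sanity-check the IPv4 and IPv6 address-space numbers.
# IPv4 addresses are 32 bits; IPv6 addresses are 128 bits.
import ipaddress

ipv4_total = 2 ** 32    # 4,294,967,296
ipv6_total = 2 ** 128   # about 3.4 x 10^38

# Addresses per person, assuming ~6.5 billion people:
per_person = ipv6_total // 6_500_000_000   # about 5 x 10^28 ("50 octillion")

# The ipaddress module parses both address formats:
print(ipaddress.ip_address("192.0.2.1").version)    # 4
print(ipaddress.ip_address("2001:db8::1").version)  # 6
print(f"{ipv6_total:.2e}")
```

The addresses in the example are from the documentation-reserved ranges (192.0.2.0/24 and 2001:db8::/32), so they don't point at any real machine.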

One of my favorite people who makes geeky stuff understandable is Carla Schroder. She wrote a great series of articles for O'Reilly about IPv6. They are worth a read.

Continue Reading

Web 2.0 Part III: Blogs, Podcasting and Vlogs

On 27 Sep, 2006 By mpm With 1 Comments

When I start out these series, I seem to have an idea in hand about how to organize them, which, invariably, gets rearranged in the course of writing. Such is life. I had originally planned to talk about RSS/XML after tagging, but I decided instead to hold off on that as a start on the posts about the inner guts of Web 2.0. So, here's the post about blogs, and their follow-ons: podcasting and vlogging.

Blogging is old hat. I've been blogging for almost 4 years now. I wrote a blogging module into my web database project, Xina, more than 4 years ago. I have pretty much always understood the difference between blogs and websites - but I recently got a better feeling for it as I was redoing my own website. It's not so much about depth and breadth, although that certainly can be a part of it. It's more about the ephemeral versus the enduring. Blog posts get old, and out of date, fast. That's part of the point. Websites shouldn't. Which, of course, is why many people and many organizations don't need blogs. But that topic will wait for a few paragraphs - let me finish my descriptions first.

There has been a lot of talk about nonprofit blogging in the last while. Most recently, Michael Gilbert pointed me to a very good whitepaper by Nancy White about blogs and community. It's worth a read. She has some interesting things to say about the emergent properties of blogging communities. At this point, many nonprofit technology providers have blogs, and use them to get their message out (and, I think, create an interesting community that is somewhat changed from the community I knew pre-seminary, which was primarily fueled by email discussions.) The originators of blogging probably thought of it mostly as a way for people to be able to easily update their websites quickly, and provide interesting content on a moving basis. I think the community aspects of blogging were somewhat unexpected.

Their natural follow-ons, podcasting and vlogging, are not as ubiquitous, or as frequently used, in the nonprofit technology space. Beth Kanter has been doing some great coverage of the emerging fields of blogging, podcasting and vlogging (she has a fabulous linkroll of blogging how-tos on her blog.) Podcasts are simply audio blogs that are downloadable, so you can put them on your favorite digital audio device (hence, "Pod"casting). Vlogs are video blogs - they can be as simple as a talking head in front of the camera, or as complex as including animation and other things.

It could be argued that iTunes made podcasting mainstream. But without a doubt, YouTube made vlogging, and mass video creation, mainstream. And the major engine that makes these three types of ephemeral media really work is RSS, which is the subject of the next post.

So, now the question - should a nonprofit organization have a blog? Should staff of a nonprofit blog? Would this help: 1) gain donors? 2) communicate the message? 3) keep stakeholders informed? 4) provide collaborative opportunities within, and between organizations?

All of these are good questions, and the answers will be totally different for different organizations. I can think of two organizations that I've worked with which are, in a sense, case studies for why to have a blog, or not to have a blog.

Organization 1 is a medium-sized mental health organization in a smallish city in the Northeast. It gets most of its clients by referral, and just about all of its funding by state or federal contract. It has clearly defined policies and procedures. It continues to grow, but is growing in well-defined ways that mostly don't require communication with many stakeholders.

So, should this ED start a blog, or should the organization have a blog? Unless the ED wants to provide some kind of leadership in the mental health or nonprofit space, this ED doesn't need one, and neither does the organization. The time and effort it would take to maintain a blog isn't going to result in any better accomplishment of mission. (Actually, they don't even have a website. Which is just fine.)

Organization 2 is a small pro-choice membership organization that depends upon outside funding, has many stakeholders in many different communities, and provides advocacy and activism nationally. Should this ED, and/or this organization, have a blog? Heck, yes (in fact, it was for this organization that I originally wrote the blogging module that I mentioned above.) The time and energy it devotes to its blog(s) (yeah, they should probably have more than one) would likely pay off in the short, and long, run.

But there are many, many organizations in the middle of these extreme examples. Blogging takes time, focus, and energy from someone or someones. And it only makes sense if the connections that can be made, the communication channels opened, the voice heard is worth that investment.

As for podcasting and vlogging: I'm much, much more bearish on those technologies (oooh, something I can be bearish about <wink>). First off, both of these (particularly vlogging) take an order of magnitude more time and energy to produce than a blog. And they likely have an order of magnitude less audience. I'd argue that only organizations whose major focus is technology or media, or who are large enough, and have enough audience (like an Oxfam, or a Greenpeace), should tiptoe into this territory.

And, I'd argue, the stakes are higher for an organization than for an individual that starts a blog, or podcast, or vlog, and then decides later to stop. I think it might be better not to start at all. But it does require a lot of thought. Look at what organizations like yours are doing. Look at what kinds of things you can do to your website, for instance, to create RSS feeds for new content, instead of thinking of starting a blog.

It is my not-so-humble opinion that, like many technologies, simply the presence of these tools provides pressure for some to adopt them. I'm an early adopter, I know - it's easy to feel like everyone's doing it, and maybe I should look into it. Or whatever. But like any technology decision, it requires thought about how useful that technology will be, and whether, and how, it will serve your mission.

Continue Reading

Web 2.0 and database technology

On 24 Sep, 2006 By mpm

I've been beginning to think a lot about databases, and where they are going. I've been using databases since grad school, and relational databases for the past 10 years or so. There have been some specific advances in Web 2.0 that might, in the end, change how we think about databases.

This is described well in a post on O'Reilly Radar, which describes what Google did when it was creating a new bug tracking system. They, of course, have the world's most kick-ass full-text searching system (I'm not sure whether that's Web 1.5 or 2.0.) So they combined that system with specific kinds of tagging and metadata to decrease the structure of the bug tracking system's database - they were encouraging people to just put lots of text in a free-form field.

It made me think - how many of the databases that we create and use could be simplified by adding tagging and really good full-text searching? I can already imagine something like an event management system, or other kinds of content-rich applications that have depended upon highly structured relational schemas, using this kind of new idea. Come up with one good full-text and metadata search functionality (or use someone else's) and you trim down the time and energy spent both creating the schema and entering the data, at the same time as you enrich the content.
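To make the idea concrete, here's a toy sketch in Python of a store that keeps only free-form text plus tags, and relies on search instead of a rigid schema. This is my own illustration of the concept, not Google's actual system, and all the names and records in it are invented:

```python
# A toy "schema-less" store: each record is free text plus a set of tags.
# Finding things is done by (naive) full-text search plus tag filtering,
# instead of by joining highly structured tables.
from collections import defaultdict

class TaggedStore:
    def __init__(self):
        self.records = {}               # id -> (text, set of tags)
        self.by_tag = defaultdict(set)  # tag -> set of ids
        self.next_id = 1

    def add(self, text, tags):
        rid = self.next_id
        self.next_id += 1
        self.records[rid] = (text, set(tags))
        for tag in tags:
            self.by_tag[tag].add(rid)
        return rid

    def search(self, words=(), tags=()):
        """Return ids whose text contains every word and that carry every tag."""
        hits = set(self.records)
        for tag in tags:
            hits &= self.by_tag.get(tag, set())
        for word in words:
            hits = {r for r in hits if word.lower() in self.records[r][0].lower()}
        return sorted(hits)

store = TaggedStore()
store.add("Annual gala, Sept 30, Northampton. Contact Jo for tickets.",
          ["event", "fundraising"])
store.add("Board meeting minutes, August.", ["governance"])
print(store.search(words=["gala"], tags=["event"]))  # [1]
```

Notice that adding a new kind of record needs no schema change at all - the tags and the text carry all the structure there is, which is exactly the trade-off the Google example makes.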

I kinda like it.

Continue Reading

Web 2.0 Part IIa: Social Bookmarking

On 24 Sep, 2006 By mpm With 9 Comments

After writing my post on tagging, I got sidetracked by Marnie Webb's mention of Ma.gnolia, went off to investigate, and then decided to write about social bookmarking tools. Ma.gnolia is a new(ish) social bookmarking tool. There are some interesting comparisons out there - see Notmyself, Phil Crissman, and Jeff Croft for a good review of Ma.gnolia's open API.

First, I'll do a quickie review of the social bookmarking phenomenon, why I've been using del.icio.us, and why I'm switching to Ma.gnolia. And then, I'll ask myself some questions about it.

So, all browsers keep bookmarks - they help you easily go back to and find sites that you visit regularly. These days, most browsers have a nice bookmark bar, which really helps organize the sites you visit most. But what about the sites bookmarked by people who do the same kinds of work that you do, or like the same kinds of things that you do? You could google for them, but wouldn't it be great to see other people's links - things they found organically? Also, wouldn't it be great if, no matter where you were, you could get to your bookmarks?

So those are the reasons I use a social bookmarking tool like del.icio.us. (Here's a pretty complete list of tools from listible, another kind of social bookmarking tool.) I have to admit that my reasons have more to do with the latter (getting at my bookmarks, in an organized - i.e. tagged - fashion, from anywhere), but I do like, on occasion, to find people who are kinda like me and see what their bookmarks are - and I like contributing my bookmarks. Which, in the end, is why I'm choosing to switch from del.icio.us to Ma.gnolia. I like the interface better, and the social part of the bookmarking is actually a lot better done (they have groups, as well as contacts.)

But some comments on the whole phenomenon. First, the major problem is that there are, at this point, so many of them, and although many (most?) of them have open APIs (that is, they allow other software to interact with them, and grab or add data), they aren't really interoperable (see Marshall Kirkpatrick's excellent post on issues regarding Yahoo, del.icio.us, and Yahoo's other acquisitions - which adds another good reason to switch away from del.icio.us.) There isn't a way to, for instance, add the same bookmark to several social bookmarking sites at once (there is, however, a cool Greasemonkey script that allows you to copy del.icio.us bookmarks to Ma.gnolia.) You basically have to either decide which site has your loyalty and stay with that one, or spend a lot of time importing and exporting and double/treble/quadruple bookmarking. It should be interesting to see how this plays out. del.icio.us clearly has had the lion's share of attention for a while, but who knows how long that will remain true.
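To illustrate the interoperability gap, here's a hypothetical sketch in Python of the kind of glue code you'd have to write yourself: one bookmark mapped onto each service's different parameter names. The endpoints and parameter names are placeholders I made up - they are not the real del.icio.us or Ma.gnolia APIs - and nothing is actually sent over the network:

```python
# Build the same bookmark as a request for two (hypothetical) services.
# Every service names its parameters differently, which is the
# interoperability problem in miniature.
from urllib.parse import urlencode

# Hypothetical services: endpoint plus a map from our field names
# to that service's parameter names.
SERVICES = {
    "service_a": ("https://api.example-a.com/posts/add",
                  {"url": "url", "description": "title", "tags": "tags"}),
    "service_b": ("https://api.example-b.com/bookmarks/new",
                  {"url": "link", "description": "name", "tags": "labels"}),
}

def build_requests(url, description, tags):
    """Map one bookmark onto each service's parameter names (no network I/O)."""
    bookmark = {"url": url, "description": description, "tags": " ".join(tags)}
    requests = {}
    for name, (endpoint, param_map) in SERVICES.items():
        params = {param_map[field]: value for field, value in bookmark.items()}
        requests[name] = endpoint + "?" + urlencode(params)
    return requests

reqs = build_requests("http://zenofnptech.org", "Zen of NP Tech",
                      ["nptech", "blog"])
print(reqs["service_a"])
print(reqs["service_b"])
```

With a shared standard, this translation layer wouldn't need to exist at all - which is the point of the complaint above.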

The next question is, well, how useful is this anyway? In some ways, I use social bookmarking tools like listible as a more directed Google (I've yet to create any lists). I use Furl to keep pages whose content I absolutely want to keep, in case the site goes away (Furl could go away too, of course, so maybe I should save that stuff to my hard drive - they have a cool export feature.) Bookmarking saves me time, for sure. But it's also true that a lot of the "social" in social bookmarking has been more of a time suck than a time saver.

But, as I've said, it's not all about efficiency. Does it really connect me to people? Sometimes, but not generally. In general, del.icio.us focuses more on the metadata aspects of the social part of social bookmarking (centered around tags). Ma.gnolia seems better. We'll see. But in the meantime, social bookmarking tools are, I think, a useful part of Web 2.0.

Continue Reading

Web 2.0 Part II: Tagging

On 21 Sep, 2006 By mpm

The first of the new technologies that I'll talk about as part of Web 2.0 is something called "tagging." Tagging isn't really a technology at all. It's really a new method of keeping track of metadata. It is a key part of all of the best Web 2.0 tools out there which are about collaborative content creation, like del.icio.us, flickr, listible, and others.

What makes tagging special, I think, is that the tags chosen are totally up to the user. And that helps to create what are called "folksonomies" - collaboratively created categorizations of information. This is in contrast to most previous techniques of categorizing data, where some individual or organization came up with a way to categorize things, and individuals had to conform to those categorizations. Folksonomies are the result of many individuals choosing, with or without influence, tags to use.
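A folksonomy in miniature can be sketched in a few lines of Python: several users tag the same item however they like, and the aggregated tag counts become the shared categorization, with no one imposing a taxonomy up front. The users, item, and tags here are invented for illustration:

```python
# Aggregate freely chosen tags into a folksonomy ranking.
# Users, item, and tags are made-up examples.
from collections import Counter

taggings = [
    ("alice", "nptech-conference-2006", ["nptech", "conference"]),
    ("bob",   "nptech-conference-2006", ["nptech", "travel"]),
    ("carol", "nptech-conference-2006", ["nonprofit", "nptech"]),
]

def folksonomy(taggings, item):
    """Combine every user's tags for one item into a ranked Counter."""
    counts = Counter()
    for user, tagged_item, tags in taggings:
        if tagged_item == item:
            counts.update(tags)
    return counts

print(folksonomy(taggings, "nptech-conference-2006").most_common(2))
```

The consensus emerges from the bottom up: "nptech" rises to the top because three independent users converged on it, not because anyone decreed it the official category.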

Many nptech (the favored tag for the nonprofit technology community) folks have talked about tagging and folksonomies, and there are some interesting projects afoot (keep track of the NetSquared project - it's all about Web 2.0). I think the activity around tagging has mostly settled down - tagging has become mainstream, and part of everyday tech life.

From my perspective, tagging is one of the most important new features of Web 2.0. It promotes democratic, collaborative content generation, and makes it easier to find information that you want or need, based on the way that you look at things, not based on someone else's way of categorizing information that you have to learn. It connects people. I love following other people who use some of the stranger tags that I've come up with - it turns out that a lot of their bookmarks are useful to me.

I do think that, like all technology, tagging has its limitations. It's not going to change the world. Tagging is, in the end, about bringing people together, and empowering people to be creative. The results of that creativity might be, well, not what you'd hope. But it is one of the things that has made Web 2.0 head and shoulders better than Web 1.0.

Continue Reading

Google Analytics vs Site Meter

On 18 Sep, 2006 By mpm With 8 Comments

Yes, I promise, the post on tagging and folksonomies is coming. But first, a great example of Web 1.0 vs Web 2.0 - I wanted to talk about Google Analytics. I found this by way of one of my favorite new blogs, Lifehacker. Lifehacker is great, and shares some of my ethos about technology. They had a link today to a great page: how to dissuade yourself from becoming a blogger. It's funny, and appropriate. I think some nonprofits should read it.

Anyway, being a poor student, and having a few extremely low-traffic sites, I figured I'd stick with Site Meter, which seems to be the best of the free site analysis tools. It gives you all of the necessary stats: page hits, visits, referrers, some nice geographical info, etc. Google Analytics is also free. You can follow 10 sites instead of one. And it does all of the same stuff, except better. And you don't have to have that silly cube graphic in some far corner of your site. (Some of what Analytics does I don't even understand yet.) And in terms of interface, it blows Site Meter out of the water.

Continue Reading

Web 2.0 Part I

On 09 Sep, 2006 By mpm

I liked doing the Intellectual Property series in the earlier incarnation of this blog. Writing a series gives me the space and time to think about particular technology issues in way more detail than I can in one post, and Web 2.0 is a big enough topic that it really lends itself to a series. So this is the beginning of a series of posts on Web 2.0. In each post, I'll first explain a bit about one particular aspect of Web 2.0, then talk a little bit about its implications in the nptech field, and then give my own view of it from the neo-luddite perspective.

Before I plunge in to talk about the individual parts of Web 2.0 that I will highlight, I'll give you a short definition of what Web 2.0 is. The Wikipedia entry on Web 2.0 is quite good, so if you want more detail, certainly go there. But I'll give you my quick definition:

Web 2.0 is a series of innovations in web technology that have come together in unexpected ways to change the experience people have in using the internet, making it much more deeply a many-to-many experience, rather than the one-to-many experience it had been before. The technologies generally connected to Web 2.0 include, depending on one's definitions: many new kinds of communities such as MySpace and Flickr; blogging, podcasting and vlogging; tagging and folksonomies; RSS feeds; content-rich web applications using technologies like Flash and Java; open standards and APIs that allow seamless connections between different web applications; new kinds of user interfaces using AJAX; and different design aesthetics. Hallmarks of Web 2.0 sites include a democratic approach to content, organization by tagging, and new, much more flexible and intuitive interfaces.

At this point, I use Web 2.0 applications every day. I blog, I use Flickr, I search blogs using Technorati, I use del.icio.us and tag my links, and I contribute content to a number of sites, including H2O Playlist. I think Web 2.0, as its version number suggests, is a much richer, more rewarding experience than Web 1.0 was.

And I think that there is a lot the nptech field can get from using Web 2.0 tools - since in many ways, the most important aspects of Web 2.0 are about empowering individuals, and connecting people to each other through the content that they create, or are interested in. But there is a lot of hype regarding Web 2.0, and I want to talk about that hype, and about the possible pitfalls of jumping on the Web 2.0 bandwagon. Jumping on any technology bandwagon has its pitfalls, and this one is no different.

So, what's on tap?

First up, after this post, will be an investigation of tagging and folksonomies. Then, I'll talk about RSS and XML. These are, I think, the two most important aspects of Web 2.0 in terms of their positive impact. I'll then talk about blogging, podcasting and vlogging, which I think are probably the most hyped, and potentially the least useful for nonprofits to jump into without a lot of thought. After that, I'll go under the hood, and talk about things like open APIs and AJAX.

Continue Reading

The language we use

On 06 Sep, 2006 By mpm With 2 Comments

I came across, in my catching-up period, an article titled "Ten ways to change the world with Web 2.0." It's actually a great article, by Marnie Webb of Compumentor, who I think thinks cool thoughts, and does cool things.

I got on Beth Kanter's case a while back, when she was posting about "Tagging for social change."

One of the things I said to Beth was:

I have to admit to some hesitance even thinking about a phrase like "Tagging to make social change." There is no question that technology in general has created sea changes in the ways in which organizations get and use information, reach donors and constituents, create campaigns, etc. But I think the jury is still out, at least from my perspective, on whether or not this sea change in communication has actually resulted in very much on-the-ground social change. Are there actually really any fewer homeless people? Did MoveOn actually manage to help elect someone more progressive? Is the environment any cleaner?

I do think the jury is still out, and I think that using language which raises expectations about any one single new technology (like Web 2.0) is not a good idea. Yes, let's talk about how Web 2.0 is doing a great job at bringing people together in ways that are new for this newish medium called the internet. But let's not fool ourselves into thinking, without proper evidence, that this is going to be any more effective than lots of old technologies nonprofits have been using for years.

Continue Reading

Other neo-luddites and interruptive technologies

On 04 Sep, 2006 By mpm With 1 Comments

One of the nice things about catching up with the field is that you get to acquaint yourself with people who you've just heard of, but never met. This includes the "East Coast" Michael Stein. (I have worked a little with the "West Coast" Michael Stein.)

He has a great post about interruptive technologies, like phones, text messaging on phones, and IM. He says:

Observing my response to these two items helped me understand the Amish response to the ubiquitous telephone. I'm never without my Treo smartphone. But I wouldn't dream of answering it during dinner, and I often let it go to voice mail during the day. As my coworker Krista says - "the phone ringing is an invitation, not a command." People talk about "disruptive" innovations - seems to me the mobile phone is a highly "interruptive" technology that needs to be controlled. Unlike email, for example, which you can check when you are ready to.

It is a set of interesting questions. I am much better at not answering my cell phone when I don't want to than I am at keeping my IM off when I want to concentrate. But I like that idea. To some extent, it is true that interruptive technologies like IM and cell phones do increase my tendency toward multitasking in a way that probably leads to less awareness, not more.

Michael Stein might not like being called a "neo-luddite", but my definition is anyone who asks questions that potentially make us think about our assumptions about technology, and its present course.

Continue Reading

The Blackboard patent

On 03 Sep, 2006 By mpm

For some of you, this is old news, but in the process of catching up, this came to my attention. Last year, several of my classes in seminary used Blackboard, which is the major player in the e-learning space.

In the patent office's completely non-infinite wisdom, they granted an extremely broad patent to Blackboard for e-learning. This means that its competitors, which include both commercial and open source software, are theoretically in violation of that patent. And Blackboard is beginning to sue.

There is a really good review, with lots of great links, on O'Reilly Radar.

So what's the problem from my perspective? I've talked about patents before in this blog (in its previous incarnation). My perspectives haven't changed much. Education is not a luxury, to my mind. I'm not especially a fan of Blackboard, but what's true is that it was helpful for the courses I was involved in. And, further, for distance learning, it's essential. The patent office giving Blackboard what amounts to a monopoly position in the e-learning space is bad enough. Blackboard choosing to enforce that monopoly simply stinks.

Join the boycott.

Continue Reading

Catching up

On 01 Sep, 2006 By mpm With 2 Comments

It's not until I've spent a little time reading a wider array of nptech blogs that I have realized how much has changed in the last year or so, since I was last really embedded in the field. I'm hoping that will provide some very interesting things to blog about over the next few weeks, as I regain my footing.

Obviously, the biggest change is the ubiquitous nature of Web 2.0, and the ways it's made its way into the nonprofit sector. I think a lot of Web 2.0, particularly RSS and folksonomies, is incredibly helpful, and provides things that really do connect people, and help people decide in a much more granular way what they want to read, and have access to.

Some of Web 2.0, though, is more hype than useful. How many nonprofits really need to have a blog? Sure, I'd love to see more nonprofits move from sending their newsletters out by email to getting them into an RSS feed, which I can choose to look at, or not. Otherwise - I think it depends a lot on the mission of the organization, for sure.

Anyway, I'll have more to say about Web 2.0 soon, once I finish catching up.

Continue Reading

Laptops in schools

On 31 Aug, 2006 By mpm

Slashdot, in its standard inimitable way, pooh-poohs parents who opposed giving students laptops.

Schools giving students laptops is hardly new. It has been considered a good idea to give students laptops, to give them access to the wealth of resources available, learn about technology, learn to make ... powerpoint presentations ...

It turns out, a lot of students spend time playing things like World of Warcraft, or cruising the net, instead of using their laptops to actually learn something, like ... powerpoint.

OK, I imagine you are getting my drift here. Does it make sense to give every student their own laptop? A while ago, it was about getting students ready for the new workplace (ugh), and bridging the digital divide (a laudable goal.) Now, in the US, the digital divide is, for the most part, not between the economic haves and have-nots, but mostly between the technology want-ers and want-nots.

My suggestion? Something old. Computers in classrooms where they make sense, and in the library. Provide help to parents who really can't afford to buy their kid a computer or a laptop. Leave it at that.

Continue Reading

The Resurrection of a Blog

On 31 Aug, 2006 By mpm With 2 Comments

Blogs die, blogs come back, new, different, and informed by experience. I've lived a year without doing much technology work or being involved or engaged in the nonprofit technology field. I've lived a year thinking about spirituality and religion, reading sacred texts, and living and talking with people whose lives are centered around the divine, and the heart. I've lived a year knowing that the most important thing in my life is my connection to the divine/ultimate reality/my highest self. I've had the benefit of distance, as well as the benefit of a year of thinking about what's really important to me in life, and what I think is important in the world as we enter some difficult times.

So what is this blog going to be? Think of it as a place where you'll hear hard questions asked, assumptions questioned, and technology trends dissected. A place where technology is more about connecting people than making work more efficient. Where technology is more about making our lives more interesting and creative, instead of making a buck. And, because of that, it will be a place where you'll hear that it might just be better to sit in a room with people than send them email. Or it might be better to throw a fundraiser instead of putting a button on your website. Or better to get up out of our chairs, instead of sitting in front of a screen. A place where the default is slowing down, not speeding up; staying still, not upgrading; a place where less is more.

I've been an early adopter pretty much my whole life, and I have spent an inordinate amount of time in front of computer screens. And part of what you'll hear here is me asking questions of myself. Why is it that I want that new technology gadget with five gazillion cutting-edge features? Why is it that I read fifty gazillion blogs in a day? What does it add to my life? What does technology really add to my life?

I hope that it will provide, at least, some food for thought.


Google Talk is Jabber!!

On 25 Aug, 2005 By mpm

I heard the announcement about Google Talk, did a quick perusal of the website, saw that they were only going to release a Windows client, and decided not to bother with it. First, I was annoyed, as usual, by Windows-only clients. And, based on what I'd read in the mainstream press (I actually only read one article in the LA Times last weekend), I figured this was going to be too little, too late, even for Google. But then I found out from another blog that the protocol Google Talk is using is Jabber! For those of you who are not familiar with Jabber, here's their good basic info page.

This changes everything. Jabber is an already established open protocol that has been a favorite of geek types for a while. There are multiple clients available for all platforms. There are all sorts of cool ways that developers can extend this.
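Part of what makes an open protocol like Jabber so extensible is that everything on the wire is plain XML. As a rough sketch (the addresses are made up, and a real stanza also carries namespaces and an id attribute, omitted here), this is roughly what a chat message stanza looks like, built with Python's standard library:

```python
import xml.etree.ElementTree as ET

# Build a minimal Jabber/XMPP-style chat-message stanza.
# Addresses are hypothetical; real stanzas also carry namespaces and ids.
msg = ET.Element("message", {
    "to": "friend@gmail.com",
    "from": "me@gmail.com",
    "type": "chat",
})
ET.SubElement(msg, "body").text = "Hello from an open protocol!"

wire = ET.tostring(msg, encoding="unicode")
print(wire)
```

Any client or library on any platform can construct and parse stanzas like this, which is exactly why third-party clients (like iChat) can interoperate.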

I am connected with Google Talk (talk.google.com) using my Mac OS X iChat client and my Gmail username.

It might take a while for it to take hold and overtake AIM, which, from what I can tell, is the most popular IM system, but this was an incredibly smart move for them - with the kind of interoperability they are building, it also allows them to do very interesting things with Gmail, and with Google itself.

Google will take over the world. I just hope they don't become like Microsoft in the process. They seem to have been making the right moves, keeping things open each step of the way, unlike MS, the king of closed, non-interoperable, non-standards-compliant development.


IP Tidbits

On 18 Aug, 2005 By mpm

Here are a few tidbits I've come across in the intellectual property arena in the past few days. Downhill Battle, an organization that people interested in the whole "copyfight" issue should know about, has a new project, called Participatory Culture. They've just released a beta version (sorry, Mac only, for once) of a new platform for internet video, called DTV. This is very cool. It makes finding channels with interesting video easy, as well as making your own channels easy. It might indeed make vlogging a lot easier as well. I'm looking forward to the Crooks and Liars channel! I'm going to watch this one pretty carefully.

There is a very interesting PDF floating about, with a PowerPoint presentation by the CEO of the RIAA about the copyright/filesharing issue as they see it. Uck. They still don't get it. But I guess they won't, given their position.

There is a new, interesting project under a Creative Commons license. It's called Orion's Arm, a huge collaborative science fiction world-building project. It looks pretty amazing - and a great testament to what open source licensing can do for creative work.


Ubuntu

On 17 Aug, 2005 By mpm With 2 Comments

I'm trying Ubuntu Linux on an old Compaq laptop I have (and brought with me to California). It's an old Compaq Armada (M300) that I bought used last year; it weighs about 2 pounds without the accessory bay. It was pretty cheap when I bought it, but it must have cost a fortune when it was new. I've previously installed regular Debian on it, plus a couple of versions of Fedora.

I've been hearing all sorts of good things about Ubuntu, and I figured it was time to try it out. Here's my basic experience and review of it.

The most recent version of Ubuntu is 5.04 (Hoary Hedgehog). You can get it from their download page. They have regular ISOs, BitTorrent files, and jigdo files. They've got some good mirrors, because the regular ISO download isn't too slow.

I am intimately familiar with Debian and Debian installs, but I'm going to write this as if I weren't - I think that will make it the most useful.

The first part of the install process (basic configuration, partitioning, and base system install) is very straightforward - there were few choices to make, the hardware was detected flawlessly, and the install went easily. I kinda went away after the first reboot, and was greeted with the login screen when I came back. No intervention was necessary. An easier install than Windows, I think.

A few things were a bit odd - for example, there wasn't a request for a root password. The default root password seems to be the same as the password for the single user account that was set up during install.

Gnome is the default desktop, and the only one installed by default - I'm a KDE fan. I switched my desktop environment in a way I'm familiar with (install the KDE packages, then change the default desktop manager in /etc/X11). Kubuntu, the sub-project to bring KDE to Ubuntu, seems really nice - and if I'd read the Kubuntu page first, I would have had an easier time switching to KDE.

The basic add/remove applications interface is nice, and the advanced panel has everything. The configuration editor is not really intuitive, but for those who don't like the command line, it's an improvement over basic CLI configuration.

All in all, I'm pretty happy with Ubuntu so far - the ease of install and configuration, matched with Debian's ease of software updates. We'll see how it works when I try to set up development environments (both Postgres/Perl and Ruby for Ruby on Rails), but I can't imagine, given the Debian base, that I'll run into trouble.


Small Notes

On 10 Aug, 2005 By mpm With 1 Comment

Since I'm travelling, I haven't had much time to think in depth about much of anything. However, in my snippets of time reading my blogroll, I've come across a few interesting technology items.

Beth Kanter has been investigating vlogging - that's video blogging. It's pretty interesting - and I imagine that once high-bandwidth connections are truly ubiquitous, vlogging might get pretty popular. But the barriers to entry are pretty high. I've done a little video editing myself, and the time and energy it takes to do it well is pretty daunting. I think I'd be lucky to get one or two vlog entries a year actually done. More power to Beth, though! And it's great to start thinking about this new technology and how nonprofits might use it. I've also been watching her coverage of BlogHer - something I just learned about, and it sounds really interesting. I wish I'd known about it sooner.

There are rumors that Palm is going to move off of the Palm OS platform, possibly to Windows Mobile. Case in point: the new Treo 670 has been seen running Windows Mobile. This is all rumors and innuendo, but the truth is, if Palm moves to Windows Mobile, I hate to say it, but I'll dump Palm. Sad but true. I'll hold on to my Treo 650 until it falls apart, then figure out what to do next.

Unintended consequences: the new energy bill includes a change to daylight saving time, which hasn't changed since 1987. So there are a fair number of devices that have the old schedule programmed into them. I imagine this will be worse for devices not connected to the internet in any way. It could have some interesting side effects.

I've been pleasantly surprised in my travels by how much I've been able to use wifi most places. Most of the chain motels are sporting wifi, tons of cafes are, and even rest stops seem to have it. I think the days of ubiquitous wifi are coming. The question is: ubiquitous and free? Will free wifi grow, or will wifi become ubiquitous but paid? Not clear yet.


More patent office silliness

On 10 Aug, 2005 By mpm

OK, this is great. At the same time as the patent office is granting business method patents that everyone knows have tons of prior art, they are busy rejecting trademarks based on who knows what, exactly. Case in point: the organization "Dykes on Bikes" was denied a trademark of their name because the word "dyke" was deemed vulgar. Never mind that dykes have used that word in a positive, self-affirming way for 30 years now, and there is tons of evidence for that. (I mean, why call your organization by a name if you didn't think it self-affirming??)

LawGeek has a nice little rant on the issue - much more in depth than I could provide. Boing Boing also has a post on this.

So when is the USPTO going to get its act together? I'm not holding my breath. I think what I might be waiting for is for the whole system to fall in on itself.


No, really? A wifi hotspot on a bus??

On 28 Jul, 2005 By mpm

I just found this very cool tool, called MacStumbler. It's a wifi network sniffer. It will tell you what networks are around, whether they are open or not, and what their signal strength is. It's useful for troubleshooting home hotspots, and for finding ones out in the world.

So I'm sitting at my now most favorite free wifi hotspot (Bart's in Amherst - it's quiet and easy to find an outlet), and I happen to be looking at MacStumbler at the same time as a bus drives by. The buses around here are called PVTA - for Pioneer Valley Transit Authority.

Anyway, as the bus rolls by, I see "PVTA_0333" come up in MacStumbler, then leave! Yeah, really, really! It reminds me of the time a friend of mine and I were driving to Rhode Island on the highway. I had her laptop open, and I saw a "Ford_F350" network come and go.

There is something called a "Stomp Box" - a wifi hotspot connected to a cellular 3G network. I think it's a great idea for, say, an RV or something. But a car? Hmmm.... So when does this become ubiquitous? Maybe faster than we think.


Social Source, Open Source, Socialism

On 27 Jul, 2005 By mpm

David Geilhufe has a new post on his blog, entitled "Social Source Socialist?" It raises an important issue, and I've been wanting to talk a bit about the ways in which open source software in the nonprofit space is related to our economic system. This is pretty airy-fairy, pie-in-the-sky stuff, but why have a blog if I can't do that?

The nonprofit sector (often called the "Third Sector") is primarily (although not exclusively) geared toward the betterment of human lives. The "First Sector" (or is it the "Second"? I never know), which is capitalist enterprise, is primarily (although not exclusively) geared toward maximizing profit. It is true that this sector provides some betterment of lives through employment, but as we've seen lately, this tends to mean paying as many people as little as possible, leading to their needing Third Sector services. The real betterment of lives this sector has provided, at least in the last 20 years or so, has gone to the very few at the top.

One of my pet peeves, over the time I've worked with nonprofits, is the extent to which they've incorporated business (that is, capitalist enterprise) processes and ethos into their operations. One of the most disheartening and troubling things I come across is nonprofits that see other nonprofits as their "competitors." How is it that nonprofits "compete" to better all human lives? Doesn't this detract from the core ideals? I understand the reasons this happens - decreasing revenue, and competition for the same private donors, foundation grants, and government grants. But I wonder if it is really as necessary as many nonprofit "gurus" say it is.

It has always been my argument that the way an organization does its work is as important as the work it does. Corporate practices (competition, resisting unionization, efficiency as means of maximizing revenue, etc.) are usually not consonant with the goals and ideals of most nonprofit organizations.

So this leads me back to software. How could changes in the way nonprofits look at their work, and at where they get their software, change the kinds of software they use? If nonprofits thought differently - more collaboratively - a natural outgrowth, I believe, would be collaborative IT projects, leading to the kinds of economies of scale that large nonprofits (or corporations) can achieve. And it might lead to rethinking the use of closed-source commercial software in favor of open source software that can benefit the commons, instead of the few.

David says:

\"Shouldn\'t technology enable nonprofits to do more and to do it more effectively? Restricting nonprofit use of fundraising tools (through expensive proprietary software licenses) limits the number of people nonprofits can engage, the volume of donations nonprofits will receive, and ultimately, the universe of people nonprofits can help.\"

I couldn't agree more.


Intellectual Property, Part III

On 21 Jul, 2005 By mpm

First off, this post is in honor of the EFF Blogathon. Read all about it. I'm hoping that by writing this series, people who haven't been aware of these issues become more aware, and understand the stakes involved.

In the first part of this series, I talked about my perspectives on open source software and its position in the whole intellectual property debate of our times. The second post was about the evil of patents. In this third post, I want to talk about the issues of intellectual property and creative work.

First, though, some background on my own "vested interests" (or, more honestly, lack thereof). Although I am a published poet and author (that's a stretch - I've written some articles that have been published in edited volumes, journals, and a magazine few have heard of), and I occasionally make music (that's a real stretch), I've never made a dime off of my creative work. I have made many dimes from my creation of software, but that was part one. So, for some, this does not make me one who should say much about copyright of creative work. But I'm going to plunge in anyway.

Also, in internet parlance, IANAL (I Am Not A Lawyer). I am fascinated by the law, and read about the law, but I've never been to law school. Copyright law is a pretty abstruse field, and I don't even begin to pretend I know it. What I'm going to talk about are broad-brush issues: what the current landscape in copyright law looks like, what issues are arising, who benefits from the current system, and who could benefit from a system that is more open.

Right now, creators of creative works are protected by copyright for their lifetimes plus 70 years, so many works will be protected for 100 years or more. (Most work created prior to 1923 is in the public domain.) Copyright means that no one can take the work, whole or in part, and reproduce it without the consent of the copyright holder. Also, no one can produce derivative works without the consent of the copyright holder. Copyright is granted automatically when a work becomes tangible - but it is up to the holder of the copyright to enforce it. (Some links are at the end of the article.) There are exceptions, called "fair use."
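To make the arithmetic concrete, here's a toy calculation (the dates are invented for illustration) showing how a life-plus-70 term routinely exceeds a century:

```python
# Life-plus-70: a work enters the public domain 70 years after
# the author's death, no matter when it was created.
def public_domain_year(death_year):
    return death_year + 70

# Illustrative numbers only: a work written in 1960 by an author
# who dies in 2000 stays protected through 2070 -- 110 years in all.
created, died = 1960, 2000
years_protected = public_domain_year(died) - created
print(public_domain_year(died), years_protected)  # 2070 110
```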

So what's up now? Why is this such a huge issue, and what's at stake? Very simply: technology. It all started with the Xerox machine, the audio cassette deck, and the VCR. These technologies, and the technologies that have followed (computers, scanners, DVD recorders, MP3 players, software for ripping CDs, etc.), have made copying creative works trivial. At this very moment, if I wanted to, I could find and download most of the music recorded in the last 20 years or so, most of the movies, and quite a number of books (text or audio), and pay not a dime except the cost of bandwidth and storage, which is minimal compared to the cost of buying all of that content.

Of course, I'd be infringing on the rights of all of those copyright holders, which, for the most part, are record companies, publishers, and movie studios - all of whom have big bucks, and all of whom are extremely unhappy at the current state of affairs.

Now, I'm sure some of you reading this remember the brouhaha about cassette tapes, and how they were going to spell the end of the music industry. Didn't happen, did it? And VCRs were going to spell the end of the movie industry. Not hardly. (In the end, it benefited mightily from them.) Now, of course, Napster's progeny (currently named BitTorrent) will spell the doom of both the music and movie industries. I have my doubts about that, too.

But they are very busy making their case that they will be done in by this technology, and, because they have all sorts of money on their side, they are getting heard in Washington (as well as Silicon Valley - more on DRM later). The most important law that bears upon this is the DMCA (Digital Millennium Copyright Act), a very broad law that basically criminalizes the production of technology that makes it easier to infringe on copyright, and increases the penalties for copyright infringement on the net.

There is a fascinating example of the new ways in which technology can be used to create and disseminate content in ways that, although they violate present copyright law, in fact hurt no one (and, it might be argued, would help the owners of the copyright).

In 2004, an artist named DJ Danger Mouse created a mix of the Beatles' "White Album" and rapper Jay-Z's 2003 "Black Album," to create an album called the "Grey Album." It was only available as a bootleg, since he didn't get permission from anyone to do the mix. The result was amazingly creative, and critically acclaimed. Although DJ Danger Mouse violated the copyrights of various owners, it would really be difficult to argue that the resulting work in any way damaged those copyright owners. EFF has a good review of the legal issues.

Now some people realize that the internet could provide a really great vehicle for disseminating creative work. (Wow, really? Took them a while.) Enter the iTunes Music Store (iTMS) and the reborn Napster. These two services have different models: with iTMS you download a file and are free to do a certain limited set of things with it; Napster is a subscription service - let the subscription lapse, and your music library goes silent. What they have in common, though, is DRM, or Digital Rights Management. One of these days I'll actually spend some time writing about DRM, because the concept and technology are interesting, and there are ongoing arguments as to whether it could work at all. But what the folks who are letting you download content for a fee are doing is hobbling that content in various ways to control your access to it.

This is a lot like commercial software. You don't really own it, and you are told specifically how you can use it. Who does this benefit? As with software, it's basically the big, powerful players who already have lots and lots of money. Most musicians and authors, like software developers, make a living (actually, many never do), and that's about it; only some do better.

So, as the content industry (record companies, movie studios, and publishers) moves in the direction of disseminating content electronically, but in ways that strictly control how you can enjoy it, there is another movement: a combination of people who've learned lessons from open source software, musicians who have always been friendly to music copying, and content creators who would like to be more directly in control of their own creative content.

This movement is made up of small, independent music distributors that allow you to download, sample, and/or buy MP3s (sans DRM) online, and of authors licensing their work with open source-like licenses, allowing you to create derivative works from their own. Let me talk about why I think this model, rather than the model being fostered by the RIAA and MPAA, works in everyone's interest (er, except for people who want to make oodles of money off of other people's work).

1) Creators of content can choose how and when to disseminate their creative work - they can choose someone to help them promote it, or not.

2) As often happens, word-of-mouth and freely released copies of content and derivative works actually increase interest in (and therefore the revenue generated by) the creative work. Here's a new example. And here's another, from a sci-fi author.

3) Consumers of creative work get maximal control of how they can use the content they have obtained (either for a fee, or for free).

4) Authors of creative work can be inspired to create new work based on the work of others, taking it in directions that are unpredictable, and potentially very interesting (like the Grey Album, except that in this scenario, DJ Danger Mouse wouldn't get cease-and-desist letters).

I think this is a much better model than the one that only lets you play music you bought on pre-approved devices, only lets you read a book you bought on your desktop computer (and not also on your laptop and palmtop), and makes you keep paying a fee for music you lose if you stop paying. I can't believe that most consumers will put up with this for long (although I have been amazed before at what consumers will put up with when it comes to technology). It will be interesting to see how things finally shake out.

Links:

- basic copyright FAQ
- Wikipedia entry on copyright
- EFF on DMCA
- ALA's guide to the DMCA
- Creative Commons

Blog-a-thon tag: EFF15



The Problem that Won't Go Away

On 21 Jul, 2005 By mpm

I hate spam. I always have. But lately, like most people, I don't have to deal with much of it. Between server-side Bayesian filtering and client-side filtering, only two or three spam messages get into my actual inbox every day. Very nice.
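For the curious, the core idea behind Bayesian filtering is simple: combine per-word spam probabilities learned from your own mail. A toy sketch, with invented word probabilities and a simplified Graham-style combination (real filters use smarter training, smoothing, and word selection):

```python
from functools import reduce

# Per-word spam probabilities, as a filter might learn them from a
# user's own mail corpus. These values are invented for illustration.
word_spamminess = {"viagra": 0.99, "meeting": 0.05, "free": 0.80, "lunch": 0.10}

def spam_score(words, prior=0.5):
    # Naive-Bayes-style combination:
    # P(spam | words) ~ prod(p_i) / (prod(p_i) + prod(1 - p_i))
    probs = [word_spamminess.get(w, 0.4) for w in words]  # 0.4 for unknown words
    p = reduce(lambda a, b: a * b, probs, prior)
    q = reduce(lambda a, b: a * b, [1 - x for x in probs], 1 - prior)
    return p / (p + q)

print(spam_score(["viagra", "free"]))    # close to 1 -> likely spam
print(spam_score(["meeting", "lunch"]))  # close to 0 -> likely ham
```

Because the probabilities come from your own mail, the filter adapts to what spam looks like for you, which is a big part of why it works so well.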

But now, it appears that spam is making its horrible way to the web. The first I heard of this was blogs (on Blogger, mostly) created only to manipulate search engines.

Now there is a trend in domain name registration. People will register a domain, test it for traffic, and use it only to deliver Google or Yahoo ads. If that domain can generate one or two dollars more than the cost of the registration, they keep it. If you do this for thousands of domains, you can generate thousands of dollars in revenue.
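The economics are easy to sketch: keep any domain whose ad revenue beats its registration fee, and small margins multiplied across thousands of domains add up. A quick illustration with invented names and numbers:

```python
# Invented sample of parked domains: (name, ad revenue, registration cost).
domains = [
    ("cheap-stuff-01.example", 11.50, 8.00),
    ("cheap-stuff-02.example", 3.20, 8.00),
    ("cheap-stuff-03.example", 9.75, 8.00),
]

# Keep only the domains that earn more than they cost to register.
keepers = [(name, rev - cost) for name, rev, cost in domains if rev > cost]
profit = sum(margin for _, margin in keepers)

print(len(keepers), round(profit, 2))  # 2 5.25
```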

I imagine that, as with Bayesian spam filtering, tools like del.icio.us and other collaborative bookmarking tools will mean that people come across those blogs and domains less and less (and thus make them less cost-effective), but in the meantime, we have web spam to deal with.


What is the bleeding edge doing?

On 18 Jul, 2005 By mpm With 4 Comments

As I've stepped out of the direct role of being a nonprofit technology consultant, I've realized that it gives me a chance to see things from a bit of a different perspective. I've been faithfully following a number of recent discussions on nonprofit blogging, social bookmarking, new and exciting tools, and the like. I'm having fun reading all the great posts on the blogs of friends and colleagues whose opinions I value highly (like Beth Kanter, Deborah Finn, Jon Stahl, Katrin Verclas, Art McGee, David Geilhufe, Marnie Webb, and others) that I didn't have time to read before. This is all very cool, and makes the geek part of me happy. I'm beginning to wonder, though: what is the role of the "über" consultants, as I'll name them (us? do I qualify?)

I spent a lot of my time as a technology consultant helping nonprofits see the value of open source software. For the first few years I was doing this, I would use the words "open source" and I'd see this glazed, distant look in their eyes. They had no idea what it meant, why it was important, or how it could help them. For the most part, my clients were doing really, really well if I could get them to remember to test their backups, run virus protection, and troubleshoot why the printer doesn't work.

I can guarantee you that if you said the words "nonprofit blogging," "RSS," or "social bookmarking" to your average E.D., or even CIO/CFO, you'd see that same glazed, distant look. I spent a bit of time recently helping my congregation migrate their website (since I, the main webmaster, was leaving to go to seminary), and they were grappling with issues that we'd been hashing out, oh, 3-4 years ago, when we first started talking with organizations about CMS vs. HTML.

Does this mean we should stop talking about all of those cool new things happening in webland? No, not at all! There is a lot for all of us to learn with these new tools and ideas, and adding them to the nonprofit technology toolkit is a great idea. And disseminating those ideas to people who are in a position to use them is important. But I worry sometimes that we (I include myself in this, for sure) are acting a bit too much like the hare, and not enough like turtle boosters. "Slow and steady wins the race." Nonprofits still struggle with data management issues, the sector still struggles with a lack of standards, and there is still an amazing lack of inexpensive, good, solid software for nonprofit mission-critical tasks.


On that Mac/Windows subject...

On 18 Jul, 2005 By mpm

This is incredibly cool. I don't know if it's true, but I like it. A lot. If true: get an Intel Mac, and run Mac software, Windows software, and UNIX software (via X Windows) too. Wowie zowie! Not only a geek's dream, but a very nice solution to all sorts of problems.

Via Digg.


A \"rational response\"?

On 17 Jul, 2005 By mpm With 5 Comments

Today in the New York Times there is an article (reg. required) about how people are throwing away their old PCs in an effort to rid themselves of spyware, viruses, and the like. "Throwing out a computer 'is a rational response,' said Lee Rainie, director of the Pew Internet and American Life Project..."

A rational response?? Rational would be to wipe Windows and replace it with Linux. Rational might be, like the woman pictured, to throw away the box and get a Mac (or, for the brave, wipe Windows and put Mac OS on the same box). I can't quite believe we have gotten to the point where people are not only willing to put up with the viruses and spyware, but are willing to go out and buy another Windows machine! Microsoft, which makes a decent amount on each purchase of a Windows machine, must be jumping for joy. They are entering the anti-spyware business, and, of course, they will "not [be] providing protection for people who have earlier versions of the company's operating system."

First off, MS entering the anti-spyware business is kind of like a vendor who sold you locks that are incredibly easy to pick coming in and saying they'll clean up the mess created by a break-in (for a fee, of course). And if the mess is too much, they'll sell you more locks!

How long are people going to put up with this nonsense? And how long are nonprofits - where every dollar spent on new PCs is a dollar that doesn't feed someone, provide services, pay for medicine, or fund an activist on the Hill - going to put up with it?


H2O Playlist

On 14 Jul, 2005 By mpm

This is a very cool tool, discovered via my buddy Deborah Finn. H2O Playlist is basically a way to share lists of content about a specific topic. It's a great way to create syllabi and other teaching resources that others can share. I can think of a number of ways this would be useful. In some ways, why not just create your syllabus on H2O Playlist, and use that as a way to share it with your students? And it will only get more useful once lots of things get catalogued (I was amazed to learn, for example, that the Tao Te Ching is already on quite a number of playlists).

And the ability to make RSS feeds out of playlists, new playlists, etc. is fascinating. And, like Wikipedia, the more people that use it, the more useful it is for everyone - so I beg to differ with Deborah: use it! (I'm creating a playlist now on the progressive religious movement.)

The one odd thing: just about all of the playlists I've seen so far have either 5 "bulbs" of influence, or none. So I think the algorithm for figuring out influence needs some tweaking. :-)


Another reason to hate Microsoft

On 14 Jul, 2005 By mpm

Why on earth did they pick the most obnoxious, polluting, gas-guzzling, view-obstructing vehicle possible to promote their new version of Windows Automotive? I guess it's in character.

Via Engadget, of course.


Grokster

On 27 Jun, 2005 By mpm

The Supreme Court handed down a unanimous ruling in MGM v. Grokster. I'll be talking more about it later, once I read and digest everything (I'll make it part of the IP Part III post I've been promising). In the meantime, here's Boing Boing's post on the varied coverage.


Social Source Software

On 26 Jun, 2005 By mpm

My colleague David Geilhufe, with whom I've spent many an hour discussing open source software issues, just published on his blog, Social Source Software, a post where he begins to lay out his vision for the open source ecosystem in the nonprofit sector.

I especially like the contrast he draws between the closed-source and open-source vendor perspectives on effective and affordable software, and I look forward to hearing more.


del.icio.us direc.tor

On 25 Jun, 2005 By mpm

I came across an amazing tool built on the social bookmarking service del.icio.us. It's called del.icio.us direc.tor. Here's a screenshot of what it looks like for me:

[Screenshot of direc.tor.] Basically, it takes your bookmarks and organizes them according to their tags. This is very, very cool.

There are details on how it works (via XML, XSLT, and JavaScript). It works in Firefox and IE, but not in Safari (here is what the author of the tool has to say about Safari).
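The core trick - inverting a flat bookmark list into a tag-to-bookmarks index - is simple to sketch (the bookmarks below are invented):

```python
from collections import defaultdict

# Invented sample of del.icio.us-style bookmarks, each with a list of tags.
bookmarks = [
    ("http://example.org/rss-intro", ["rss", "tools"]),
    ("http://example.org/nptech-blog", ["nonprofit", "tools"]),
    ("http://example.org/copyfight", ["copyright"]),
]

# Invert into a tag -> list-of-bookmarks index, so any tag
# pulls up every bookmark labeled with it.
by_tag = defaultdict(list)
for url, tags in bookmarks:
    for tag in tags:
        by_tag[tag].append(url)

print(sorted(by_tag["tools"]))
```

direc.tor does this in the browser with your real del.icio.us data, which is what makes browsing a big tag collection feel instant.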

It makes me want to bookmark and tag just about everything I come across, though!



Intellectual Property Issues, Part II

On 24 Jun, 2005 By mpm

OK, first: I lied. I'm going to talk about patents before I talk about copyright and copyleft of creative content.

As I said in part I of this series on intellectual property, I am not a purist about open source software, and as you'll find out in part III, I'm not a purist about copyright and copyleft of creative content, either. However, when it comes to patents, I am a purist. In my opinion, patents are wrong, and even more so in their current implementations. Patents and patent law are complicated, and many people think they aren't relevant to their lives. Nothing could be farther from the truth.

Patents on things like business methods and software methodology are at the very least annoying, and limit creativity and innovation. Patents on drugs, living things, and food crops are, plain and simple, evil. And as far as I am concerned, the baby needs to be thrown out with the bathwater.

What do I mean by evil? Well, one of these days, on my ministry blog, I'll have a post on what I think about evil - it's outside the scope here. Suffice it to say that patents on those things create situations where wealthy people and corporations hold, in the palms of their hands, the lives and well-being of the most vulnerable beings on our planet, and, in fact, profit from suffering. I can't possibly think of another word to describe that besides evil.

Patents have an interesting history, one I have only a passing familiarity with. You can find out a lot more from the list of links at the bottom of this article. Patents are supposed to provide protection for the inventors of a product, method, or process, so that these cannot be used by others without compensation. At first, this sounds pretty reasonable. And, in fact, for a long while it worked fairly well, and protected inventors from being taken advantage of by big companies that could manufacture a product and bring it to market in a way that an individual inventor could not.

But something happened on the way to the forum. Individual inventors went the way of the dodo, technology changed the face of products and processes, corporations started filing patents on things they hadn't actually invented, and the patent office went off the deep end (or, rather, caved in to corporate pressure).

Let me first start with software and business method patents - things I know a little bit about, and patents that are more annoying than evil. As a generic web developer, I am in violation of several patents just by doing what I do. It's a good thing that the owners of these patents haven't yet decided to go after the likes of me, so I don't worry too much about it. If they did come after us web developers, it would absolutely stifle creativity and innovation in software and on the internet. I certainly couldn't afford the fees they would want me to pay.

Let me give one example: frames. This is pretty old news. SBC Communications claims a patent covering frames, a very common technique used on websites. In fact, it's quite likely you visited a site recently that used frames. They've tried to enforce their patent and demand fees from web site owners. Frames are a standard HTML feature that has been in use for years (they began as a Netscape-specific extension and became part of the HTML 4.0 standard, released in 1997). I don't know how this finally turned out, but it is a clear example of a problematic patent. Another example: Jeff Bezos, founder of Amazon.com and owner of the famous "one-click shopping" patent, now also owns a patent on "information exchange between users of different web pages." This means that, if he wanted to, he could demand fees from websites (like this one) that use trackback technology. Ouch.

Luckily, some in the nonprofit sector are getting hip to the issues involved in software and business method patents, and see how they could be detrimental to the sector. Yay for NIA.

It's pretty simple to see how software and business method patents stifle innovation and communication, and ultimately don't serve the common good. No, not evil, but certainly problematic.

Let's talk about the really bad patents. First, drugs. Drug patents are designed to give drug companies time to recoup their R&D costs by granting them exclusive rights to a drug for a certain number of years. Right now, a drug company has exclusive rights for 20 years from the patent filing, before generic manufacturers can begin making the drug.

I could go on for days about the fallacies behind drug company R&D (they spend more on marketing than they do on R&D, and a lot of drug R&D is funded by the government and private foundations). But what happens is that drug companies get to set any price for a drug during the time that they have exclusive rights to manufacture it. This is why, for example, South Africa had been violating the drug patents for AIDS drugs, so that it could get drugs to people who need them (luckily, the drug companies relented in their fight).

Amazingly enough, drug companies and others are fighting to extend that term. Given the current realities of our health care system, the more years a life-saving or quality-of-life-improving drug can be exclusively marketed by a drug company, the more people will be choosing between living (or feeling better) and eating or paying rent. And this means that drug companies are profiting from their suffering. Now, you might think, from the way they talk about this, that drug companies are hurting. Nothing could be further from the truth. The drug industry is the most profitable industry there is.

Now, on to patents on living things. Ever heard of the OncoMouse? It is a mouse that was developed to study cancer, and the first animal to be patented. (The first living organism to be patented was a bacterium.) There are a number of problems with this patent. First, it will stifle cancer research, since the owners of the mouse can charge fees for research using it. Second is the basic bioethical issue of the ownership of a type of living animal, however it might have been developed. Third are the animal rights arguments - these mice live to get cancer and die. (I feel torn about the animal rights arguments, but I wanted to raise them.) These are all good reasons to reject patents on research organisms.

An extension of the precedent for patenting organisms is the practice of patenting genetically engineered food crops. These crops have been genetically modified for certain characteristics (depending on the type of food). The genetic engineering is problematic in the first place (as you can read in that link above), and the addition of patents takes control of the use of food crops out of the hands of farmers and puts it into the hands of large corporations. In third world countries, this risks disaster. All to make money for people who don't need it.

As someone who is deeply concerned about the ways we treat each other on this little spinning ball we're sitting on, the current state of patents, and the kinds of negative effects it has on people, suggest to me at least a major overhaul, if not a wholesale abandonment, of the patent system. I wish that more organizations, whether secular or religious, would understand and embrace these issues, as complicated and esoteric as they may seem.

One thing I didn't say about open source software in part I is that software has become an essential part of the matrix in which we live our lives and connect with each other. Who controls that software has an effect on that matrix. We all eat, many of us take drugs, and all of us benefit from scientific research. Who controls those things has an effect on our lives, and it's time we understood this.

Links:


Intellectual Property Issues, Part I

On 24 Jun, 2005 By mpm With 2 Comments

One thing I haven't yet written much about in my blog is intellectual property issues. I do, in fact, care deeply and think frequently about them.

Part of my interest in intellectual property issues (the whole range, from patents, to copyright) stems from my interest and fascination with the law. Part of it comes from my interest and knowledge about technology, and how technology has, for many reasons, changed the equations about intellectual property, and made the whole set of issues quite a bit more complex than they used to be. And lastly (but far from least), my interest in intellectual property issues comes from my interest in the common benefit, collective creativity, and common good of all human beings.

In this post, I'll start with open source software, its position in the intellectual property realm, and why I think supporting and using open source software is so important. Part II will focus on copyright and copyleft, new ways of distributing creative content (writing, music, art, video, etc.), and the benefits I think we'll gain from them. Part III will focus on patents, and why I think they are evil (really, I do).

Many of you know what open source software is already. For those who don't, there is a great Wikipedia article on it (and, why not, I wrote a primer on it for the nonprofit sector). What open source software introduced into the world is the idea that you can make software better by having it open to scrutiny, collaboratively developed, and freely accessible. What it has also shown, over time, is that it can be a more economical way for people to take advantage of technology.

Microsoft, and other traditional software makers, make money by packaging and distributing their software in a form that is unreadable, unscrutinizable, and costly. They have elaborate EULAs (end user license agreements) which limit the ways that end users can use their software (you mean you thought you owned that copy of Microsoft Office? Sorry to tell you, you don't). The money traditional software makers make on their products goes to two places: their investors, and back into more development (to make more money).

Open source software makers (everyone from IBM and Red Hat, publicly traded corporations, to individual software developers) primarily make their money by selling the services and consulting that go with the software they give away. You can always get the software free, but if you need or want support, it's available for a fee. Some (especially Linux distributions) do actually sell their software packaged (though free versions are available). Some open source software is developed by nonprofit organizations (the Mozilla Foundation, for example), which get some funding from fundraising and grants.

There is plenty of evidence, now that open source software has been around for a while, that it is possible to make money developing it. I doubt, honestly, that a new Microsoft will emerge from the open source field. But plenty of mortgages are being paid, families fed, and retirements funded.

And what happens to the code? In the case of traditional software vendors, the code stays secret, stays with them, and no one except them (or select partners) ever sees it. In the case of open source software, it stays in public view, downloadable and modifiable by anyone with the expertise and interest.

I think it's pretty obvious why this is more in the common good. Third world countries are getting on the open source bandwagon big time. Why? Because they can implement open source cheaply, using less expensive hardware. I've been saying that nonprofit organizations should do the same, much for the same reasons, as well as the additional reason that they support a system of software development that benefits the common good, which, for most nonprofits, is what they exist to do.

In some senses, the whole set of intellectual property issues right at this moment comes down to one question: who will benefit? As we heap more and more dollars on people who already have more than enough, we are also putting in place more and more structures (patents, copyright extensions, regulations like the DMCA) that keep copyright holders - who are, by and large, large and wealthy corporations - in control, able to limit access to and use of software and keep prices high.

Lest you think I'm a purist - I'm actually not. I don't think all closed-source software is evil, and I think there is no reason there can't be a healthy mix of open source and closed-source software in the marketplace. Heck, this blog was written on a Macintosh (which, in some ways, is an interesting mix - an open-source BSD Unix core with a closed-source GUI - and I'm using the open-source browser Firefox). But many traditional software makers (unlike Apple, which seems mostly fine coexisting with it) see open source software as a threat, and would like it to go away.

What's most important is that we, as individuals and organizations, should think carefully and make careful decisions about the software we use - right now, it has more impact than we might think.


Remember when 1 MB was a lot?

On 23 Jun, 2005 By mpm With 19 Comments

OK, so I'm feeling old today. I just came across a post on Slashdot about a 1.5 petabyte system. A petabyte is 1,000 terabytes. What's a terabyte, you ask? That's 1,000 gigabytes. I have about 0.5 terabytes (500 gigabytes) of storage attached to my computer, and that's a lot. Most people are happy with 40 gigabyte hard drives.

I remember, somewhat fondly, the old PDP-11/70 that I worked with in graduate school, back in the early '80s. It had a 10 MB hard drive that required two people to lift and put into the drive bay. I have no idea how expensive it was, but I imagine it cost thousands of dollars. Now, a flash drive with 10+ times that capacity will sit lightly on your neck, and lighten your wallet by a mere $20.

And there are 1 TB hard drives (that's 100,000 times that old 10 MB drive) that will sit on your desk, and that you can now take away for a mere $900.
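The jumps in scale above are easy to check with a few lines of Python, using the decimal units (1 KB = 1,000 bytes) that this post uses rather than binary ones:

```python
# Decimal storage units, as used in this post (1 KB = 1,000 bytes).
MB = 10**6
TB = 10**12
PB = 10**15

old_drive = 10 * MB   # the PDP-11/70's two-person hard drive
new_drive = 1 * TB    # a desktop drive today

# The hundred-thousand-fold jump from the old 10 MB drive to 1 TB.
print(new_drive // old_drive)   # 100000

# The Slashdot system's 1.5 PB, expressed in terabytes.
print(1.5 * PB / TB)            # 1500.0
```

(Drive makers use these decimal units; operating systems often report binary ones, where 1 TiB = 2^40 bytes, which is why a "1 TB" drive shows up as about 931 "GB".)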

No wonder people keep talking about how people will stop deleting things. With tools like Spotlight or Google Desktop Search, you can find anything, at any time. I have files on my hard drive now that I've carried along since my first computer in 1987. I won't be surprised if I still have those files on the new computer with a 1 PB hard drive that I buy in 20 years' time.


Interesting Weird Technology Blog

On 23 Jun, 2005 By mpm

I came across a site called we-make-money-not-art today, thanks to BoingBoing (as usual, some of the most interesting technology stuff gets posted there first). It's chock-full of interesting tidbits (like the prayer rug that gets lighter the better it's pointed toward Mecca).

What really caught my eye was a cube that plays video. It's a new project (just in the prototype phase) that sounds really interesting - a cube you can easily carry around, with six faces of video on it. Strange, but interesting.

Might be a neat site to watch.


Welcome to my technology desk

On 23 Jun, 2005 By mpm

I decided it made a lot of sense to divide my blog space and create a brand-new technology blog. I'll focus on nonprofit technology issues, open source software, gadgets, useful online tools, and the like.

I imagine my posts both here and on my main blog will be a little less frequent, since I'm splitting my time. But it seemed to make the most sense. So if you are interested in my views, approaches, and ideas about technology, this is where to look.

Although I'm going to seminary, technology is, and has been for almost 30 years, in my blood. I can't help it: I think about technology, read about technology, live technology. So here you can read about technology. If you have any ideas or suggestions, please feel free to drop me a line.


This work is licensed under a Creative Commons Attribution-ShareAlike 3.0 Unported License.
