Excel to MySQL service

About 10 years ago, I built an ‘Excel to MySQL’ service. I let the domain name go, and had run into programmatic issues. Ten years later, it’s still a problem I need to address now and then, so I’ve revived a basic version. Feel free to have a poke around, and if there’s some interest, I may build this out more.

Excel to MySQL link

There’s loads of stuff it doesn’t do yet, but if the demand is there, I’ll make it do more. 🙂

Feedback appreciated.

New dev habits

In the last 2-3 years I’ve started to kick off more projects with Laravel 5 vs other toolkits. There’s been a variety of reasons for this, which I’ll likely blog about in the next few months. Initially it was mostly an experiment: a chance to get some hands-on experience and compare/contrast with other toolkits I’ve used. There are still bits I don’t care for (as with any tool) but there are a couple of bundled conveniences which have influenced how I build code.

The ‘tinker’ REPL provides a good balance between quick ‘lemme check this code’ and keeping all the framework bootstrapping and autoloading stuff in place (vs just writing a file and running it via php cli). I’ve built CLI bootstrap processes in the past which accomplish some of this, and having this already bundled with the toolkit provides more convenience.
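For anyone who hasn’t used it, a tinker session looks roughly like this (the model and data below are made up for illustration – the point is that the framework is fully bootstrapped, so your models and autoloading just work):

```
$ php artisan tinker
>>> $user = App\User::first();
>>> $user->email;
=> "jane@example.com"
>>> App\User::count();
```

No bootstrap file to write, no includes to wire up – you’re dropped straight into your application’s context.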

Relatedly, the ‘console’ command structure and support being bundled has had an impact. When these tools are already bundled and there’s little configuration/setup I need to do, I tend to use them more.

FWIW, this isn’t meant as a “pro-Laravel” post – I use it and like it, but also realize this is bundling tools and concepts that already existed (IIRC the Laravel console stuff is primarily built on the Symfony Console component, and tinker itself sits on top of PsySH).

So… how have these helped? They’ve made it easier for me to think in terms of smaller, reusable code blocks. By knowing I have a documented and fairly easy way of using code X both from the CLI and from a web request, I find myself thinking and coding with that reuse in mind from day one, vs looking at existing code and thinking about how to refactor it to be reusable.

That said, phpunit has always had that effect, but I don’t often find myself thinking in terms of tests at the beginning of a project. That’s where tinker tends to come in. Being able to jump to a command line and exercise some small bit of code first, before the ceremony of tests and associated setup, has been a big help for how I work. Side note: I’ve recently switched to ‘tinx’ to allow for faster context reloading when inside tinker.

Between moving to PHPStorm years ago, and tools like tinker, phpunit and psysh, I’ve felt more productive and able to deliver more maintainable code. Even though I still have a ways to go on the discipline of sticking to things (I never feel like there are enough tests on projects, for example), this growing ecosystem of PHP tooling and community has had a big impact.

PHP now vs 2009

I’ll start by saying these aren’t quite ‘new’, but I’ve not written in a while, so thought I’d throw a quick update.

In the past 3-4 years, I’ve re-engaged with PHP. I started with PHP in 1996, and while I’ve never *not* done PHP, I spent several years in the Java world. I started some Java projects in 2009/2010, and the PHP landscape looked pretty different back then. Looking at PHP 5.2, maybe 5.3: the speed was… meh, the tooling was getting better but not great, and it felt like there was a lull while the community decided on what PHP’s direction and future might be.

During that time, many other tech/frameworks were gaining momentum – nodejs certainly, and Rails was maturing and gaining some traction as well.

A few things changed in the intervening years – PHP7 and composer. These two really changed a lot for me, and for many others; I started more projects in PHP than in other tech stacks, and I’m glad I did.

I don’t expect PHP to rival C or Java for raw speed any time soon, but for regular web app stuff, the speed has improved a lot from where it was in 2009. Comparative benchmarks show PHP 7 running 2-3 times as fast as PHP 5.2/5.3, and PHP 7.3 is, from what I’m seeing, roughly 4x (and in some cases 5x) faster at regular benchmark tasks than where we were 10 years ago.

Speed is useful, and reduced memory consumption is great, but the composer universe, even with some flaws here and there (I may write about some niggling bits in the future), really transformed how most professional developers use PHP. There was a recent PEAR compromise, and I was… amused? Not surprised, really – to read so many comments from folks asking “what’s PEAR?”. Maybe one day they’ll ask “what are globals?” or “what’s mysql_query?” 🙂

I’ll be hitting the Sunshine PHP conference in Miami Feb 8 – just a couple weeks from now. If you’re going to be there, give me a holler and we’ll grab dinner or a drink and wax PHP, development and everything else 🙂

EducatorSurveys.com – teaching working conditions surveys

Years ago (and up through 2018) I worked with the New Teacher Center in California, alongside some talented folks there, to develop and manage a survey system for administering and reporting on “teacher working conditions”. It was a new avenue for me, and I learned a lot about the education space over the years on that project.

Externally, the project was known as “TELL” – Teaching, Empowering, Leading and Learning. The survey administration and reporting served as the basis for schools to identify issues and develop improvement plans. By periodically remeasuring the school teacher climate (“climate surveys” was another phrase I learned then!), schools, districts and even states could have a better understanding of which changes were leading to improvements.

In 2018, NTC decided to stop providing the teacher working conditions project. There was a change of focus at NTC, and they’d decided the TELL project wasn’t something that fit their new focus, so no new working conditions surveys were scheduled.

Bit of background – the majority of surveys we did were for entire states (Kentucky, North Carolina, Tennessee, Delaware, Ohio, etc). As you can imagine, it’s a moderately large effort (and expense) to coordinate something like this for a whole state. Over the years, we’d had requests from smaller entities (school districts, mostly) to use the service, but the original process and survey hadn’t been designed to ‘scale down’, so smaller organizations weren’t easy to support.

Taking that idea, a former colleague from the project and I put together a somewhat scaled back version of the earlier system. The new system can be found at EducatorSurveys.com. We’re in a testing phase right now, and are in discussion with a couple of districts to pilot the new system soon.

If you’re interested in learning more, please reach out to me at michael@educatorsurveys.com – or 919-827-4724. I’d be happy to talk to you about how working with climate surveys can help your school or district.

LockDown WordPress Plugin

I’ve been a small-time user of WordPress for… probably 12 years or so.  I’ve had friends and colleagues go “full-on” into the WordPress world, but it’s never quite felt like my cup of tea.  A bit too much magic, or hard-to-determine steps, etc.  Not horrible, but not my favorite tool.  That said, for blogging directly, it’s great.

However, I’ve had friends and clients that use it as the basis for everything.  And… they get hacked.  Often.  I see a dozen or more ‘hacked’ WordPress installs a year, and by ‘see’ I mean someone’s asked me to help take a look and fix it.  *ALMOST* always, the final culprit was a malicious file written to the server’s drive, or an existing file overwritten with malicious code, and the WordPress install was spewing out bad stuff (SEO spam, redirects to porn/gambling, web shells to trigger spam emails, etc).

There have been numerous attempts to prevent these attacks, or, in some cases, to make “cleanup” easier.  But in no case I can think of does any service focus on locking down the actual files first, making them unwritable.  This is, in my view, the biggest bang for the buck – prevent writing in the first place, and the majority of exploits will be averted.  You may still have other exploits (XSS, SQL injection, etc) – of course.  But stopping people from writing to your drive *before* it happens should be a bigger priority, and it seems to be overlooked.

Welcome LockDownWP.  This is a plugin I’ve put together to help make this easier for people who aren’t server admins.  The focus is pretty simple – make all files unwritable until you need to write to them.  Press a button, make your files writable for, say, 10 minutes, get your file changes done, then lock things down again.
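At the filesystem level, the underlying technique is roughly this (a sketch, not the plugin’s actual code – the WordPress root path here is hypothetical, adjust for your install):

```shell
#!/bin/sh
# Hypothetical WordPress root - pass yours as the first argument
WP_ROOT="${1:-/var/www/html}"

# Lock down: files become read-only, directories stay traversable
# (and readable) but not writable
find "$WP_ROOT" -type f -exec chmod 444 {} +
find "$WP_ROOT" -type d -exec chmod 555 {} +

# To unlock for updates/maintenance, restore the usual permissions:
#   find "$WP_ROOT" -type d -exec chmod 755 {} +
#   find "$WP_ROOT" -type f -exec chmod 644 {} +
```

The plugin essentially wraps this cycle behind a button and a timer, so you don’t need to shell in (or be a server admin) every time you want to install an update.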

Is this foolproof? Of course not – you’ll still need to keep the system up to date to prevent other exploits, but I can say that since I’ve been using this technique, I’ve not had any malicious code injected on any servers I manage (and, before I took this approach, it was a … not-uncommon thing I was dealing with).

There’s more on the way for this – it’s still a beta version at this point, but I’d love any feedback you have on the plugin, or ideas you’d like to see.

Take a look at LockDownWP

Haven’t blogged in a long time…. 🙂

Have had multiple ideas for projects in the back of my mind for years, and rarely get the chance to work on them (or force myself to when I have the time).

As with many ideas, you struggle to even explain them to someone, or you find there are dozens of options already out there.  Yet we push on…

I meet a fair number of people in the tech world, and many people want to discuss ideas, but often want NDAs.  I don’t often sign them, often because they’re one-sided, or very one-sided, or even extremely one-sided.  And long.  5 years after we stop talking, or ‘in perpetuity’.  I’ve wanted something a bit more lightweight, and fast, to act as a sign of good faith while not tying either party to something which will present problems later.  Lastly, I wanted a place to keep track of these sorts of docs (and eventually others) – an NDA has date/time limits on it, and I often forget where mine are.

So… mutualndas.com was put together.  This gives you the ability to send over a mutual NDA with a definable length of enforcement, definable jurisdiction, and allows for basic electronic signature.  I can send out an NDA, and the other person can sign it (on desktop or mobile device) and send it back, and we each get a PDF emailed to us, all in about 2 minutes (longer if the other party wants to read the whole thing, of course!)

There’s precisely one NDA – a boilerplate – which I’m going to change a bit (and have an attorney give me some input based on those changes), and eventually I’ll offer a few options here.  Again, this is mostly for myself, and there are lots of small things that could be added, but I’m putting it out there now just to see what sort of feedback I get.  I’d like to charge for this service at some point (unlimited docs, alerts before “end of first year”, etc), and potentially add some area to document specific “confidential” items.  Often these sorts of NDAs indicate that you have to keep “confidential” info confidential, but the term can be nebulous.  In a mutual NDA, if you were to document certain items you consider confidential (up front or during the course of the engagement), it would serve as a third-party record.

In any event, give it a shot and tell me what you think 🙂

Importance of backups…

Well… here we are.  10 years later, and … no backups.  Or… at least, no backups of the data that was important.

Recently had a drive crash in my main server where this blog is hosted.  Had it happen 2 years ago, but the data was recovered, and I put everything on automatic backups.  Using virtualmin, a great control panel, I had it automatically back things up to s3 and to a second local drive.

HOWEVER… I got lazy.  I made some databases by hand, instead of using the virtualmin tools (either the CLI tools or the web screens), so the blog database had been disassociated from the main domain account, and it wasn’t backed up.
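The belt-and-suspenders fix I’ve since adopted is a dumb cron script that dumps *every* database MySQL knows about, whether or not the control panel knows about it (a sketch – the paths and the credentials file here are hypothetical, and you’d still want to ship the dumps off-server):

```shell
#!/bin/sh
# Hypothetical location: /etc/cron.daily/mysql-dump-all
# Dumps every database, so hand-made ones can't be silently missed.
BACKUP_DIR=/var/backups/mysql
mkdir -p "$BACKUP_DIR"

for db in $(mysql --defaults-extra-file=/root/.my.cnf -N -e 'SHOW DATABASES' \
            | grep -Ev '^(information_schema|performance_schema)$'); do
    mysqldump --defaults-extra-file=/root/.my.cnf --single-transaction "$db" \
        | gzip > "$BACKUP_DIR/$db.sql.gz"
done
```

The point is to enumerate databases from MySQL itself, rather than from a list the control panel maintains – that’s exactly the gap that bit me.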

I’ve just lost 10 years of blog posts, comments, etc.  I’ve asked for the drive, if it’s still around, to be shipped to me, and I may try to recover data (a few other bits would be really useful to have as well), but I’ve had to come to terms with the notion that it just may be gone.  🙁

This may energize me to post here more, however.  I’ve gotten a bit lazy and engaged people more on facebook the last year or so, vs here, and I’ve missed posting here, so… I may be back in more volume soon!