The Finished Porch

posted on 2004-10-11 at 22:41:46 by Joel Ross

Earlier this year, I mentioned that we added on to our house. Well, the pictures are finally up. I'll only show the before and after, but you can see more pictures here.

Here's what the house looked like before:

Here's the final result:

It's been done for a while now, but we just got the pictures uploaded yesterday. Here's what we did: First, we removed the big picture window in the front of the house. We replaced it with two much smaller windows that open, and added a wall between them. Then we added a porch across the front of the house, about 20 feet by 7 feet. If you look at the original picture, the old porch was a small square of cement. That's still there; we nailed the wooden porch right to the old cement porch. The tool for that was basically a gun, and it had quite a kick to it.

After the porch, we added the roof line, as well as the supporting beams. Ever tried to hold a 20-foot header beam above your head while someone else nailed it in place? Not fun. After we did that, it was noon on the first day, so we decided to eat lunch!

Actually, that was Memorial Day weekend. Throughout the summer, we put up the rest of the siding, shingled the roof, added shutters, put in some porch lights, added a ceiling, moved one side of the landscaping, and added a little landscaping on the other side. The project is pretty much done - the windows still need painting (we got lazy), but that can be done anytime.

Anyway, this is my first major house project (not the last, that's for sure), and, while I didn't do most of the planning for it (thanks to my father-in-law for that), I still put in a lot of work, and am pretty proud of how it turned out.

Categories: Personal


 

I'm Thinking of Starting A New Pet Project

posted on 2004-10-10 at 22:46:17 by Joel Ross

As many of you can probably tell, one of the things I enjoy doing is picking NFL games every week. Right now, I track my picks in an Excel spreadsheet. Not exactly ideal, but it works. And it has for the past two years.

But as I start to look at history, it gets to be a pain to manually look back over games. For example, last week, I looked at all of the games I tracked that had a spread of 10 or more points, to see how the favored team did. It was a manual process, and very error prone. I got a number, but I'm not positive it was right. As I go forward (and I hope to), it's only going to get worse.

If only there were a storage medium that was easy to query!

So my new pet project is to write a system to track my NFL picks and run at least a few simple queries. So here's part one: my database design. At least now, I can put my picks in the database and build the application around it. It's a simple design so far - five tables.

Here's the design:

Like I said, it's nothing spectacular. It's simple, but effective. It allows me to track everything that I track now.
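To give a feel for it, here's a sketch of the kind of question this should make trivial - last week's "spread of 10 or more" lookup. The table and column names (Games, Spread, FavoriteCovered) are my guesses at the schema, not the final design:

```csharp
// Hypothetical sketch: table/column names and connection string are
// assumptions, not the final schema.
using System;
using System.Data.SqlClient;

class PickQueries
{
    static void Main()
    {
        using (SqlConnection conn = new SqlConnection(
            "server=(local);database=NflPicks;Integrated Security=SSPI"))
        {
            // How did the favored team do when the spread was 10+ points?
            SqlCommand cmd = new SqlCommand(
                "select count(*), " +
                "sum(case when FavoriteCovered = 1 then 1 else 0 end) " +
                "from Games where Spread >= @minSpread", conn);
            cmd.Parameters.Add(new SqlParameter("@minSpread", 10));

            conn.Open();
            using (SqlDataReader reader = cmd.ExecuteReader())
            {
                if (reader.Read())
                {
                    Console.WriteLine("Games: {0}, favorite covered: {1}",
                        reader.GetInt32(0), reader.GetInt32(1));
                }
            }
        }
    }
}
```

One query instead of an afternoon squinting at a spreadsheet - that's the whole pitch.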

My goals with this are still in limbo. The thought just kind of popped into my head today, so we'll see where it goes. Here's some thoughts though. Sorry for the randomness. I've intermixed technical with functional requirements.

  • Use an O/R mapper for database access
  • Unit testing completely throughout the process, including on the database
  • The ability for multiple users to enter picks and compare them against each other
  • Reports

I haven't totally thought it through yet, but with this agile thing working at work, why not apply it at home too? Eventually, things will solidify, and I'll post about decisions. My goal is to make my design and development process open. This may be a slow process, as I'm very busy, but I'll give it a try.

My last pet project died a quick death, mainly because I wasn't attached to it. This one, well, I'm slightly more attached to, so hopefully that will make the difference. Only time will tell, though, I guess.

UPDATE: I added a SeasonName to the Seasons table (thanks, Steve), and changed the data type on the SeasonId column. Also, I added data types to the image.

Categories: Personal


 

Post 100: A Look Back

posted on 2004-10-10 at 00:18:43 by Joel Ross

It only took me six months, but I'm finally at post 100. At this blog, anyway. I've had quite a few blogs in my lifetime. I've used an XML-based blog, I've used Blogspot, I have a (semi) active GeekDojo blog, and I have one through work.

But this one's the only one that gets everything I post. And that I feel pretty comfortable putting anything on. Sure, others may get content sooner (my work blog had the gmail folder thing three days earlier), but it eventually makes it here.

At first, my goal was to build a following, so I was cautious about what I posted. I had big plans. Lots of code, lots of sports, and lots of politics. But I toned the politics down, mainly because I was afraid of what my readers would think. What are they (you) looking for when you come to my site? I still don't know, but the hits keep going up (slowly, but steadily), so I must be doing something right.

I still don't post about politics much. Maybe it's because I don't feel eloquent enough to defend my position. I'm perfectly fine defending it in person, but on paper, not so much. Or maybe because, subconsciously, I don't want to offend anyone. Or provide something to someone that they aren't expecting to get. Let's get the last one out of the way right here: I may start posting about politics. And you may not like it. Or you may. If you're offended, then so be it. I won't apologize, because that's my opinion. If you're looking for something else from this site, and don't want politics, subscribe to a category feed, rather than the overall feed. The eloquent part? I can't fix that. I'll just do my best.

Blogs are an interesting thing. I can't explain why, but I have more posts this month than in any other month I've ever blogged. And it's only the 10th (10 minutes into it). And this week was the busiest work week I've had in months, yet I was still able to blog. Maybe it was seeing traffic increase. Maybe the moon is full. I have no explanation, but I do hope to keep it up. It's almost an addiction now!

It's also amazing to me how the blogosphere works. I recently saw a referrer come through from Steve's midlife crisis. That caught my attention. I've been getting a lot of referrer spam lately, but this one was different. One hit. Referrer spam usually comes in 10s and 20s. So I checked it out. Most of the recent posts are about the Middle East. But I'm on his blog list. I have no idea why, and I think there are others out there that link to me, but that one caught me off guard. You never know what you post that catches the eye of someone else.

Speaking of blog lists, I need to get mine up. And I'm working on a redesign of how the site looks. Given that I'm a horrible designer, it may be worse. I'm also trying to figure out what to do with Rosscode.com. No, the blog won't go away, but the root may change. Now that I have a bunch of extra space, I can do something with it. Some sort of portal? A wiki? A CMS site? Who knows. I guess we'll see if I can come up with anything.

Anyway, this blogging thing has been fun, and I hope to keep it up for another 100 posts. Hopefully the next 100 will come faster than the first 100.

Categories: Blogging


 

Getting Some Use Out Of Your Gmail Account

posted on 2004-10-09 at 23:59:51 by Joel Ross

I haven't found a great use for Gmail yet. I get Google alerts there, but that's about it. Why? Well, with my own domain, I already have an easy email address. Basically, anything to rosscode.com finds its way to my inbox. Why use Gmail?

That's what I was thinking when I found this. Now, I can use Gmail as a backup device. Install the software, and in My Computer, I now have a folder for Gmail, where I can copy files directly to my Gmail account. Pretty cool.

I saw this same type of thing on Linux, but since I'm not running Linux, that didn't help me much. Now I have my Windows solution!

Categories: General


 

Something I Didn't Know About Parameterized Queries

posted on 2004-10-09 at 23:46:42 by Joel Ross

We had the client's team lead in from California this past week, which is why my posting was supposed to be light last week (which it didn't turn out to be, did it?), and I learned something from him.

Well, a lot of things. He thinks he's the only one who learns anything when we get together for a week (this is our fourth week of side-by-side coding). That's definitely not the case, but I'm not here to talk about everything I learned.

I wanted to highlight just one thing. If you have a SqlParameter, the .ToString() method returns the ParameterName. That's it. Nothing earth shattering, but it was interesting to see the queries he was building. Here's an example:

SqlParameter[] arParms = new SqlParameter[2];

arParms[0] = new SqlParameter("@customerId", customerId);
arParms[1] = new SqlParameter("@status", status);

string sql = "update Customers set Status = " + arParms[1] + " where CustomerId = " + arParms[0];

When you look at the value of sql after this code runs, you get:

update Customers set Status = @status where CustomerId = @customerId

(String concatenation calls .ToString() on each object, so each parameter drops its own name into the query.)

Again, nothing earth shattering, but it does make parameter names easier to change! Gotta love customers!

Categories: ASP.NET


 

Database Cache Invalidation, In ASP.NET 1.x

posted on 2004-10-09 at 23:35:31 by Joel Ross

We've come up against a problem in my current project. We have two applications sharing a database. One will be a public facing website, and the other will be an internal administration website. They share a business layer, but not a physical deployment. The processing is data intensive, so we need to cache the data being retrieved.

The cache will be built in each application, but only updated in the admin application. So how do we expire the data in our web application? We came up with a few distinct options.

1. Use a database table, which holds the cache keys and the date they were last updated. When the cache is accessed, the application would check the date it cached the data against the last updated date in the database, and know if it had been updated. This is (from what I remember) similar to how Whidbey's database cache invalidation would work with SQL Server 2000.

2. Have the web app expose a web service that the admin application could call and pass a cache key that would expire that cache item.

3. The web application would use a timer based expiration strategy. For more dynamic items, the time would be shorter. For items changed less, the timer would be longer.

4. Build a Cache Manager component that each application would use, and each website would remote to it to get its data.

Now, the downside of each:

1. Overhead, both in writing the database access code and in the database hit for each cache access (exactly what caching tries to eliminate). Yes, the hit is smaller, but it's still a hit.

2. This probably would work, but some cached items are updated directly in the database, not through the admin web site, meaning the web service would have to be called manually. Too much room for human error.

3. Data would be just about guaranteed to be stale at some point. Not really acceptable.

4. This is a good solution, but we felt the overhead of writing this would be too much, and not easily maintained.

So we got to thinking, "Someone has to have written a solution to this problem!" On to Google. And a solution. It's an admitted hack, but it should work. Basically, you base your cache entry on a file dependency. Then you write a trigger on the cached tables that touches the file the cache is based on.

Why is this better? Well, the cache invalidation takes place instantly, and the database hit doesn't happen on every cache access. Also, the file monitoring works on another thread, so the file access (which may be on a different server, and may be slow) won't be on the same thread as the current page.
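Here's roughly what the web application side would look like. The file path, cache key, and loader method are made up for illustration; the real mechanism in ASP.NET 1.x is a CacheDependency pointed at a file:

```csharp
// Sketch of the file-based invalidation, assuming a shared file at
// C:\CacheFiles\Customers.txt that the database trigger touches.
using System;
using System.Web;
using System.Web.Caching;

public class CustomerCache
{
    public static object GetCustomers()
    {
        object customers = HttpRuntime.Cache["Customers"];
        if (customers == null)
        {
            customers = LoadCustomersFromDatabase();

            // The entry is evicted as soon as the trigger touches the file,
            // so the next request reloads fresh data.
            HttpRuntime.Cache.Insert("Customers", customers,
                new CacheDependency(@"C:\CacheFiles\Customers.txt"));
        }
        return customers;
    }

    private static object LoadCustomersFromDatabase()
    {
        // Placeholder for the real data access call.
        return new object();
    }
}
```

The admin application never talks to the web application at all - the trigger touching the file is the whole signaling channel.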

We haven't implemented it yet, but that's the plan right now. If you have any reasons this won't work, or you have a better solution, let me know. We're always open for suggestions!

Categories: ASP.NET


 

Design View Is For Wussies

posted on 2004-10-08 at 23:50:56 by Joel Ross

If you've developed ASP.NET applications in Visual Studio .NET, you've probably experienced one of these issues. When you switch from Design view to HTML view, your HTML is rewritten, and not in a good way. And here's an issue that I've seen before too - using the Design view can cause previous event wire-ups to disappear.

This isn't a post about the problems with design view. Those are well documented, and will supposedly be fixed in Whidbey. Will they be? I don't know. I don't really care. I would much rather deal with the HTML directly, and wire up control events manually, using the OnInit override. I'm not really sure when I last used design view. It just doesn't provide me with any value. The representation isn't good enough to know if the design of the page is the way it needs to be, and I'm not going to be moving things around by drag and drop.

I even told my current project team that design view is for wussies. Of course, I said the same thing about a mouse too!

In reality, I'm just messing around. Build your software however you want, but if you take one thing away from this, it's that you'll be better off if you wire up your events by hand, rather than relying on the designer to do it for you.
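For what it's worth, here's the kind of manual wire-up I mean. The page and control names are just for illustration:

```csharp
// Wiring a control's events by hand in OnInit, instead of letting the
// designer generate (and possibly later drop) the hookup.
using System;
using System.Web.UI;
using System.Web.UI.WebControls;

public class OrderPage : Page
{
    protected Button SaveButton; // declared in the .aspx markup

    override protected void OnInit(EventArgs e)
    {
        // Wire the event here, so a designer round-trip can't silently lose it.
        SaveButton.Click += new EventHandler(SaveButton_Click);
        base.OnInit(e);
    }

    private void SaveButton_Click(object sender, EventArgs e)
    {
        // Handle the click.
    }
}
```

A few lines of typing, and the wire-up lives in code you control, not in a designer-managed region.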

Categories: ASP.NET


 

Get Out Your Magnifying Glasses. Let's Find Some Milliseconds

posted on 2004-10-08 at 22:03:38 by Joel Ross

David Boschmans has three great posts about reviewing code (or, if you read it ahead of time, writing code) with an eye on performance. I've read through them, and learned quite a bit. I know I'll be going back to make some changes to the code I'm writing right now.

Anyway, there are three posts:

Part 1

Part 2

Part 3

One note from his comments: the first tip on page one, about StringBuilder, only applies if you have more than five concatenations.
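In other words, something like this only pays off once you're past that handful of concatenations; below it, plain + is fine (the method here is just my own illustration, not from his posts):

```csharp
// With many concatenations, StringBuilder avoids allocating a brand new
// string on every pass through the loop.
using System.Text;

class ConcatExample
{
    static string BuildList(string[] items)
    {
        StringBuilder sb = new StringBuilder();
        foreach (string item in items)
        {
            sb.Append(item);
            sb.Append(", ");
        }
        return sb.ToString();
    }
}
```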

Categories: ASP.NET


 

ASP.NET Authentication Vulnerability - Fix

posted on 2004-10-07 at 11:07:08 by Joel Ross

I haven't posted about this because I hadn't had time to read much about it, but I finally got the time to look at it, and the fix is pretty simple.

I hadn't looked at it yet because our project hasn't deployed, but I will need to incorporate this into Tourney Logic's software too!

Anyway, I'll give credit to Robert McLaws, since I saw his fix first.

For those who don't like clicking, here's the C# code you need. Just add it to your global.asax file.

<script language="C#" runat="server">
void Application_BeginRequest(object source, EventArgs e) {
  if (Request.Path.IndexOf('\\') >= 0 || System.IO.Path.GetFullPath(Request.PhysicalPath) != Request.PhysicalPath) {
    throw new HttpException(404, "not found");
  }
}
</script>

Categories: ASP.NET


 

ViewState Issue Solved

posted on 2004-10-06 at 23:41:11 by Joel Ross

I posted on my GeekDojo blog about problems saving ViewState with dynamic, composite controls.

Well, it looks like Scott Mitchell has a solution. Basically, add the control to the control hierarchy before you add items that need to be tracked by ViewState.

I haven't tested it, but if what he says is correct, this will take care of the problem.
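If I'm reading his solution right, it's just an ordering change. Something like this (a hypothetical composite control, not his code):

```csharp
// Hypothetical composite control: add the child to the Controls collection
// first, so ViewState tracking is active before any tracked state is set.
using System;
using System.Web.UI;
using System.Web.UI.WebControls;

public class NamePanel : Control, INamingContainer
{
    protected override void CreateChildControls()
    {
        TextBox nameBox = new TextBox();

        // Add to the hierarchy first...
        Controls.Add(nameBox);

        // ...then set state that ViewState should track. Reversing these
        // two steps is what causes the value to not round-trip.
        nameBox.Text = "initial";
    }
}
```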

Categories: ASP.NET


 
