Funding for Shepherd's Renovation

Anonymous
Anonymous wrote:
Anonymous wrote:

Have you read the entire thread? Shepherd is getting zeroed out with no promise to restore. Again, this calculation is for renovations going forward. Shepherd has been doing its renovations in phases (before this calculation came out). Grosso's comments make clear that it's all political.


It's also all political coming from Bowser. They're all waving around these objective measures as if it means something, but then running with political motivations anyway. At least Bowser is pretty obvious about her politics. Grosso pretends to be fair and balanced.


Agreed. In theory, I like the idea of a truly objective tool to determine need for renovations across the city. But how do we know this is a valid tool? How was the tool developed, and by whom exactly? Was it vetted prior to being implemented? What are the "4,200 data points" that went into the tool? Without transparency, it seems like more smoke and mirrors.

Further, I understand that exceptions were made for some schools that the Committee felt had "overriding factors"--to me this suggests either that a) the tool itself isn't as comprehensive as it could be, or b) the variables that went into the formula are softer/more subjective than what has been suggested by the Committee.

Also, some inaccuracies have been noted in Shepherd's rankings--for example, they rank enrollment as flat or negative over the past 5 years--that can't be accurate. Enrollment has grown from about 330 to 350 in the past two years since my child has been enrolled. Also, a PK3 class was added two years ago, and two more classes will be added next year due to demand (PK3 and 1st grade). Demand is growing, as suggested by waitlists. If they didn't take into account future growth projections, then the tool itself is flawed.

The variables that went into the tool are on the last page here:

https://www.documentcloud.org/documents/2830804-2016-Facilities-Analysis-With-Key-for-Dist.html
Anonymous
Anonymous wrote:... how do we know this is a valid tool? How was the tool developed, and by whom exactly? Was it vetted prior to being implemented? ... Without transparency, it seems like more smoke and mirrors.

The tool was vetted by Grosso's staff, what more do you want?
Anonymous
Anonymous wrote:Also, some inaccuracies have been noted in Shepherd's rankings--for example, they rank enrollment as flat or negative over the past 5 years--that can't be accurate. Enrollment has grown from about 330 to 350 in the past two years since my child has been enrolled. ... If they didn't take into account future growth projections, then the tool itself is flawed.

What's so inaccurate? Shepherd gets a rating of 1 in the community population growth factor. According to the last page, that means the community population of 4-10 year olds is expected to have less than 14.7% increase. That's certainly not flat. The rating doesn't mean demand isn't growing; it just means it's growing slower than other areas, such as Orr Elementary where they expect the population to grow by 25-32%. Do you have data showing there will be 15% or more increase in 4-10 year olds in the coming years in the Shepherd neighborhood?

Look, I get that it's frustrating, and I get that there's always fear of a political fix. I also am sure there's plenty of basis to quibble with some of the ratings on the spreadsheet, or at least question where they came from. But from where I sit, it seems this objective data-driven approach is a lot more logical than just individual council members pitching various projects, which is about as pure politics as I could imagine.
Anonymous
Anonymous wrote:... But from where I sit, it seems this objective data-driven approach is a lot more logical than just individual council members pitching various projects, which is about as pure politics as I could imagine.

Don't be fooled. How many of the top 5 ranked schools are actually being renovated sooner based on the 2017 budget?
Anonymous
Anonymous wrote:... What's so inaccurate? Shepherd gets a rating of 1 in the community population growth factor. ... Do you have data showing there will be 15% or more increase in 4-10 year olds in the coming years in the Shepherd neighborhood?

13:56 again. I looked again at the spreadsheet, and I'm guessing what you're complaining about is not the 1 rating in community population growth, but rather the 3 rating Shepherd got in the "5 year average annual enrollment growth" category. That 3 rating translates to (1.7%)-0.1% growth - basically flat as you suggest. Is that what you're referring to?

If so, I think you should check the DCPS audited enrollment data at the links below. It shows Shepherd with enrollment of 331 in 2011, and 330 in 2016, which fits exactly with the 3 rating.
http://osse.dc.gov/enrollment
http://osse.dc.gov/sites/default/files/dc/sites/osse/publication/attachments/Enrollment%20Audit%20Examination%20Report%202011_2012%20%282%29%5B1%5D.pdf
http://osse.dc.gov/sites/default/files/dc/sites/osse/publication/attachments/SY%202015-16%20School-by-School%20Enrollment%20Audit%20Data%20%28Updated%29.pdf
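For what it's worth, the two sets of numbers aren't necessarily in conflict; they just cover different windows. Here's a quick sanity check, assuming (the spreadsheet key doesn't spell this out) that the rating uses a compound average annual rate:

```python
# Back-of-envelope check of the "5 year average annual enrollment growth"
# rating. Assumption: the rating is a compound average annual rate -- the
# Committee's actual formula isn't published. Figures come from the audited
# numbers above (331 in 2011, 330 in 2016) and the parent's claim
# (roughly 330 -> 350 over the past two years).

def avg_annual_growth(start: float, end: float, years: int) -> float:
    """Compound average annual growth rate, in percent."""
    return ((end / start) ** (1 / years) - 1) * 100

audited = avg_annual_growth(331, 330, 5)   # audited 2011 -> 2016
claimed = avg_annual_growth(330, 350, 2)   # parent's two-year claim

# about -0.06% per year: inside the (1.7%)-0.1% band for a 3 rating
print(f"audited 2011-2016: {audited:.2f}% per year")
# about +3.0% per year: real growth, but outside the five-year window
print(f"claimed two-year:  {claimed:.2f}% per year")
```

On those assumptions, the audited five-year figure lands squarely in the band for a 3 rating, while the recent two-year growth simply isn't captured by a five-year lookback.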
Anonymous
Anonymous wrote:... Also, some inaccuracies have been noted in Shepherd's rankings--for example, they rank enrollment as flat or negative over the past 5 years--that can't be accurate. ... If they didn't take into account future growth projections, then the tool itself is flawed.

This is pretty humorous. Shepherd gets dinged in this analysis for three main reasons:

1) Recent renovation (25%) - Duh, the renovation is ongoing
2) Child population growth (5%) - incorrect data used here
3) Feeder modernization (5%) - Too many other Wilson feeders have already been renovated.

Changing the rules in the 8th inning of the game is obviously the biggest factor here.
Anonymous
Anonymous wrote:... If so, I think you should check the DCPS audited enrollment data at the links below. It shows Shepherd with enrollment of 331 in 2011, and 330 in 2016, which fits exactly with the 3 rating.

There will be 350 (or 355) students next year.
Anonymous
Anonymous wrote:... This is pretty humorous. Shepherd gets dinged in this analysis for three main reasons: 1) recent renovation (25%), 2) child population growth (5%), and 3) feeder modernization (5%). Changing the rules in the 8th inning of the game is obviously the biggest factor here.

This. We aren't at Shepherd, but anyone can see it's not a fair process to penalize a school by applying a brand-new calculation to a job that started before Grosso was even in office. The funds were already there. One can't give and then take away. Again, Grosso's remarks about the "political neighborhood" are appalling and show his true colors. The Post should pick this up (specifically, not just in conjunction with a larger story).
Anonymous
Anonymous wrote:... 13:56 again. I'm guessing what you're complaining about is ... the 3 rating Shepherd got in the "5 year average annual enrollment growth" category. ... It shows Shepherd with enrollment of 331 in 2011, and 330 in 2016, which fits exactly with the 3 rating.

PP here. Yes! That's what I was referring to--enrollment growth of a '3.' I was told recently that enrollment currently stands at 350+ (although I can't immediately verify this). Now, that doesn't fit with the data you've provided via the OSSE links above, but I wonder if that 2016 official estimate is off for some reason.

I agree that a data-driven approach is preferable, but I just want more information on how the data points were chosen, how weights were assigned to the various categories, etc. I don't need to see the entire blow-by-blow methodology, but it would be nice to see at least an overview of the process. Also, as a PP mentioned, any tool that penalizes a school and moves it further down the list for already having a renovation underway is not a good tool, IMO.
Anonymous
Anonymous wrote:... PP here. ... I was told recently that enrollment currently stands at 350+ (although I can't immediately verify this). ...

Is there room in the building for this growth?
Anonymous
Anonymous wrote:... This is pretty humorous. Shepherd gets dinged in this analysis for three main reasons ... Changing the rules in the 8th inning of the game is obviously the biggest factor here.

Isn't it a huge improvement that we can attack the data instead of each other?
Anonymous
Anonymous wrote:
Anonymous wrote:... If so, I think you should check the DCPS audited enrollment data at the links below. It shows Shepherd with enrollment of 331 in 2011, and 330 in 2016, which fits exactly with the 3 rating.
http://osse.dc.gov/enrollment
http://osse.dc.gov/sites/default/files/dc/sites/osse/publication/attachments/Enrollment%20Audit%20Examination%20Report%202011_2012%20%282%29%5B1%5D.pdf
http://osse.dc.gov/sites/default/files/dc/sites/osse/publication/attachments/SY%202015-16%20School-by-School%20Enrollment%20Audit%20Data%20%28Updated%29.pdf


PP here. Yes! That's what I was referring to--enrollment growth of a '3.' I was told recently that enrollment currently stands at 350+ (although I can't immediately verify this). Now, that doesn't fit with the data you've provided via the OSSE links above, but I wonder if that 2016 official estimate is off for some reason.

I agree that a data-driven approach is preferable, but I just want more information on how the data points were chosen, how weights were assigned to the various categories, etc. I don't need to see the entire blow-by-blow methodology, but it would be nice to see at least an overview of the process. Also, as a PP mentioned, any tool that penalizes a school and moves it further down the list for already having a renovation underway is not a good tool, IMO.

On your first point, I can't speak to what you heard about current estimated enrollment being 350+, or about PP's expectation that enrollment next year will be higher. I'd guess that DCPS and the Council used audited enrollment numbers rather than estimated enrollments to prevent schools from gaming the system for extra funds. If next year's audited enrollment numbers are higher, then presumably Shepherd might move up in the rankings and get a bigger piece of the funding pie in the future.

I'm glad we agree on the data driven approach. I feel like the spreadsheet - while admittedly complex - gives a pretty thorough overview of how the ranking process was conducted.

On your point about how an ongoing renovation affects school rankings, I can see both sides of that issue. On one hand, if you're at Shepherd, I can understand you're frustrated that you might get only 90% of the renovation someone promised you several years ago. On the other hand, if you're at another school that's never had a renovation at all and is rated as being in poor condition, you'd be pretty pissed to hear that you're not getting money because it's all flowing to some long-ago promised renovation (which itself was perhaps part of a political backscratching deal!) at a school in a wealthy part of Ward 4 that's already got facilities rated as "good." The short answer is that there's never enough money to do everything people want, so there needs to be a system to allocate it fairly.
Anonymous
Anonymous wrote:... On your point about how an ongoing renovation affects school rankings, I can see both sides of that issue. ... The short answer is that there's never enough money to do everything people want, so there needs to be a system to allocate it fairly.

So you advocate the most needy school getting funds first, but only renovating it to the point where it's not needy anymore (say 50% finished), and then it moves to the end of the line? Then the next school gets 50% done, and so on? Or is it just too bad for Shepherd for being started under the old system?
Anonymous
Anonymous wrote:This. We aren't at Shepherd but anyone can say it's not a fair process to penalize a school and apply a brand new calculation on a job that started before Grosso was even in office. The funds were already there. One can't give and then take away.


That makes no sense to me. If someone said "Well, we promised Ellington a $200 million renovation with gold-plated urinals 5 years ago, and we aren't allowed to change our minds, so I guess all the other crumbling schools will just have to wait," no one would accept that. Budgets always change based on current needs, and those needs are always changing. Having a data-driven tool at least helps minimize the weight of politics in all this.
Anonymous
Anonymous wrote:... If someone said "Well, we promised Ellington a $200 million renovation with gold-plated urinals 5 years ago, and we aren't allowed to change our minds, so I guess all the other crumbling schools will just have to wait," no one would accept that. ... Having a data-driven tool at least helps minimize the weight of politics in all this.

So making a cafeteria and gym ADA compliant is the same as gold-plated urinals? And what is Oyster going to do with the $4M?